Dec  6 04:00:29 np0005548916 kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec  6 04:00:29 np0005548916 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  6 04:00:29 np0005548916 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:29 np0005548916 kernel: BIOS-provided physical RAM map:
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  6 04:00:29 np0005548916 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  6 04:00:29 np0005548916 kernel: NX (Execute Disable) protection: active
Dec  6 04:00:29 np0005548916 kernel: APIC: Static calls initialized
Dec  6 04:00:29 np0005548916 kernel: SMBIOS 2.8 present.
Dec  6 04:00:29 np0005548916 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  6 04:00:29 np0005548916 kernel: Hypervisor detected: KVM
Dec  6 04:00:29 np0005548916 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  6 04:00:29 np0005548916 kernel: kvm-clock: using sched offset of 3146583811 cycles
Dec  6 04:00:29 np0005548916 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  6 04:00:29 np0005548916 kernel: tsc: Detected 2800.000 MHz processor
Dec  6 04:00:29 np0005548916 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  6 04:00:29 np0005548916 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  6 04:00:29 np0005548916 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  6 04:00:29 np0005548916 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  6 04:00:29 np0005548916 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  6 04:00:29 np0005548916 kernel: Using GB pages for direct mapping
Dec  6 04:00:29 np0005548916 kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec  6 04:00:29 np0005548916 kernel: ACPI: Early table checksum verification disabled
Dec  6 04:00:29 np0005548916 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  6 04:00:29 np0005548916 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:29 np0005548916 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:29 np0005548916 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:29 np0005548916 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  6 04:00:29 np0005548916 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:29 np0005548916 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  6 04:00:29 np0005548916 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  6 04:00:29 np0005548916 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  6 04:00:29 np0005548916 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  6 04:00:29 np0005548916 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  6 04:00:29 np0005548916 kernel: No NUMA configuration found
Dec  6 04:00:29 np0005548916 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  6 04:00:29 np0005548916 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec  6 04:00:29 np0005548916 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  6 04:00:29 np0005548916 kernel: Zone ranges:
Dec  6 04:00:29 np0005548916 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  6 04:00:29 np0005548916 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  6 04:00:29 np0005548916 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  6 04:00:29 np0005548916 kernel:  Device   empty
Dec  6 04:00:29 np0005548916 kernel: Movable zone start for each node
Dec  6 04:00:29 np0005548916 kernel: Early memory node ranges
Dec  6 04:00:29 np0005548916 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  6 04:00:29 np0005548916 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  6 04:00:29 np0005548916 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  6 04:00:29 np0005548916 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  6 04:00:29 np0005548916 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  6 04:00:29 np0005548916 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  6 04:00:29 np0005548916 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  6 04:00:29 np0005548916 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  6 04:00:29 np0005548916 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  6 04:00:29 np0005548916 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  6 04:00:29 np0005548916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  6 04:00:29 np0005548916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  6 04:00:29 np0005548916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  6 04:00:29 np0005548916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  6 04:00:29 np0005548916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  6 04:00:29 np0005548916 kernel: TSC deadline timer available
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Max. logical packages:   8
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Max. logical dies:       8
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Max. dies per package:   1
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Max. threads per core:   1
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Num. cores per package:     1
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Num. threads per package:   1
Dec  6 04:00:29 np0005548916 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  6 04:00:29 np0005548916 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  6 04:00:29 np0005548916 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  6 04:00:29 np0005548916 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  6 04:00:29 np0005548916 kernel: Booting paravirtualized kernel on KVM
Dec  6 04:00:29 np0005548916 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  6 04:00:29 np0005548916 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  6 04:00:29 np0005548916 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  6 04:00:29 np0005548916 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  6 04:00:29 np0005548916 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:29 np0005548916 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec  6 04:00:29 np0005548916 kernel: random: crng init done
Dec  6 04:00:29 np0005548916 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: Fallback order for Node 0: 0 
Dec  6 04:00:29 np0005548916 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  6 04:00:29 np0005548916 kernel: Policy zone: Normal
Dec  6 04:00:29 np0005548916 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  6 04:00:29 np0005548916 kernel: software IO TLB: area num 8.
Dec  6 04:00:29 np0005548916 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  6 04:00:29 np0005548916 kernel: ftrace: allocating 49335 entries in 193 pages
Dec  6 04:00:29 np0005548916 kernel: ftrace: allocated 193 pages with 3 groups
Dec  6 04:00:29 np0005548916 kernel: Dynamic Preempt: voluntary
Dec  6 04:00:29 np0005548916 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  6 04:00:29 np0005548916 kernel: rcu: 	RCU event tracing is enabled.
Dec  6 04:00:29 np0005548916 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  6 04:00:29 np0005548916 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  6 04:00:29 np0005548916 kernel: 	Rude variant of Tasks RCU enabled.
Dec  6 04:00:29 np0005548916 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  6 04:00:29 np0005548916 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  6 04:00:29 np0005548916 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  6 04:00:29 np0005548916 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:29 np0005548916 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:29 np0005548916 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:29 np0005548916 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  6 04:00:29 np0005548916 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  6 04:00:29 np0005548916 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  6 04:00:29 np0005548916 kernel: Console: colour VGA+ 80x25
Dec  6 04:00:29 np0005548916 kernel: printk: console [ttyS0] enabled
Dec  6 04:00:29 np0005548916 kernel: ACPI: Core revision 20230331
Dec  6 04:00:29 np0005548916 kernel: APIC: Switch to symmetric I/O mode setup
Dec  6 04:00:29 np0005548916 kernel: x2apic enabled
Dec  6 04:00:29 np0005548916 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  6 04:00:29 np0005548916 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  6 04:00:29 np0005548916 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec  6 04:00:29 np0005548916 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  6 04:00:29 np0005548916 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  6 04:00:29 np0005548916 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  6 04:00:29 np0005548916 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  6 04:00:29 np0005548916 kernel: Spectre V2 : Mitigation: Retpolines
Dec  6 04:00:29 np0005548916 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  6 04:00:29 np0005548916 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  6 04:00:29 np0005548916 kernel: RETBleed: Mitigation: untrained return thunk
Dec  6 04:00:29 np0005548916 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  6 04:00:29 np0005548916 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  6 04:00:29 np0005548916 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  6 04:00:29 np0005548916 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  6 04:00:29 np0005548916 kernel: x86/bugs: return thunk changed
Dec  6 04:00:29 np0005548916 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  6 04:00:29 np0005548916 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  6 04:00:29 np0005548916 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  6 04:00:29 np0005548916 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  6 04:00:29 np0005548916 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  6 04:00:29 np0005548916 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  6 04:00:29 np0005548916 kernel: Freeing SMP alternatives memory: 40K
Dec  6 04:00:29 np0005548916 kernel: pid_max: default: 32768 minimum: 301
Dec  6 04:00:29 np0005548916 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  6 04:00:29 np0005548916 kernel: landlock: Up and running.
Dec  6 04:00:29 np0005548916 kernel: Yama: becoming mindful.
Dec  6 04:00:29 np0005548916 kernel: SELinux:  Initializing.
Dec  6 04:00:29 np0005548916 kernel: LSM support for eBPF active
Dec  6 04:00:29 np0005548916 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  6 04:00:29 np0005548916 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  6 04:00:29 np0005548916 kernel: ... version:                0
Dec  6 04:00:29 np0005548916 kernel: ... bit width:              48
Dec  6 04:00:29 np0005548916 kernel: ... generic registers:      6
Dec  6 04:00:29 np0005548916 kernel: ... value mask:             0000ffffffffffff
Dec  6 04:00:29 np0005548916 kernel: ... max period:             00007fffffffffff
Dec  6 04:00:29 np0005548916 kernel: ... fixed-purpose events:   0
Dec  6 04:00:29 np0005548916 kernel: ... event mask:             000000000000003f
Dec  6 04:00:29 np0005548916 kernel: signal: max sigframe size: 1776
Dec  6 04:00:29 np0005548916 kernel: rcu: Hierarchical SRCU implementation.
Dec  6 04:00:29 np0005548916 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  6 04:00:29 np0005548916 kernel: smp: Bringing up secondary CPUs ...
Dec  6 04:00:29 np0005548916 kernel: smpboot: x86: Booting SMP configuration:
Dec  6 04:00:29 np0005548916 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  6 04:00:29 np0005548916 kernel: smp: Brought up 1 node, 8 CPUs
Dec  6 04:00:29 np0005548916 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec  6 04:00:29 np0005548916 kernel: node 0 deferred pages initialised in 10ms
Dec  6 04:00:29 np0005548916 kernel: Memory: 7763996K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618212K reserved, 0K cma-reserved)
Dec  6 04:00:29 np0005548916 kernel: devtmpfs: initialized
Dec  6 04:00:29 np0005548916 kernel: x86/mm: Memory block size: 128MB
Dec  6 04:00:29 np0005548916 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  6 04:00:29 np0005548916 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec  6 04:00:29 np0005548916 kernel: pinctrl core: initialized pinctrl subsystem
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  6 04:00:29 np0005548916 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  6 04:00:29 np0005548916 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  6 04:00:29 np0005548916 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  6 04:00:29 np0005548916 kernel: audit: initializing netlink subsys (disabled)
Dec  6 04:00:29 np0005548916 kernel: audit: type=2000 audit(1765011627.390:1): state=initialized audit_enabled=0 res=1
Dec  6 04:00:29 np0005548916 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  6 04:00:29 np0005548916 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  6 04:00:29 np0005548916 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  6 04:00:29 np0005548916 kernel: cpuidle: using governor menu
Dec  6 04:00:29 np0005548916 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  6 04:00:29 np0005548916 kernel: PCI: Using configuration type 1 for base access
Dec  6 04:00:29 np0005548916 kernel: PCI: Using configuration type 1 for extended access
Dec  6 04:00:29 np0005548916 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  6 04:00:29 np0005548916 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  6 04:00:29 np0005548916 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  6 04:00:29 np0005548916 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  6 04:00:29 np0005548916 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  6 04:00:29 np0005548916 kernel: Demotion targets for Node 0: null
Dec  6 04:00:29 np0005548916 kernel: cryptd: max_cpu_qlen set to 1000
Dec  6 04:00:29 np0005548916 kernel: ACPI: Added _OSI(Module Device)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Added _OSI(Processor Device)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  6 04:00:29 np0005548916 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  6 04:00:29 np0005548916 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  6 04:00:29 np0005548916 kernel: ACPI: Interpreter enabled
Dec  6 04:00:29 np0005548916 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  6 04:00:29 np0005548916 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  6 04:00:29 np0005548916 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  6 04:00:29 np0005548916 kernel: PCI: Using E820 reservations for host bridge windows
Dec  6 04:00:29 np0005548916 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  6 04:00:29 np0005548916 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [3] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [4] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [5] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [6] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [7] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [8] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [9] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [10] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [11] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [12] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [13] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [14] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [15] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [16] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [17] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [18] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [19] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [20] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [21] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [22] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [23] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [24] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [25] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [26] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [27] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [28] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [29] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [30] registered
Dec  6 04:00:29 np0005548916 kernel: acpiphp: Slot [31] registered
Dec  6 04:00:29 np0005548916 kernel: PCI host bridge to bus 0000:00
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  6 04:00:29 np0005548916 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  6 04:00:29 np0005548916 kernel: iommu: Default domain type: Translated
Dec  6 04:00:29 np0005548916 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  6 04:00:29 np0005548916 kernel: SCSI subsystem initialized
Dec  6 04:00:29 np0005548916 kernel: ACPI: bus type USB registered
Dec  6 04:00:29 np0005548916 kernel: usbcore: registered new interface driver usbfs
Dec  6 04:00:29 np0005548916 kernel: usbcore: registered new interface driver hub
Dec  6 04:00:29 np0005548916 kernel: usbcore: registered new device driver usb
Dec  6 04:00:29 np0005548916 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  6 04:00:29 np0005548916 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  6 04:00:29 np0005548916 kernel: PTP clock support registered
Dec  6 04:00:29 np0005548916 kernel: EDAC MC: Ver: 3.0.0
Dec  6 04:00:29 np0005548916 kernel: NetLabel: Initializing
Dec  6 04:00:29 np0005548916 kernel: NetLabel:  domain hash size = 128
Dec  6 04:00:29 np0005548916 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  6 04:00:29 np0005548916 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  6 04:00:29 np0005548916 kernel: PCI: Using ACPI for IRQ routing
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  6 04:00:29 np0005548916 kernel: vgaarb: loaded
Dec  6 04:00:29 np0005548916 kernel: clocksource: Switched to clocksource kvm-clock
Dec  6 04:00:29 np0005548916 kernel: VFS: Disk quotas dquot_6.6.0
Dec  6 04:00:29 np0005548916 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  6 04:00:29 np0005548916 kernel: pnp: PnP ACPI init
Dec  6 04:00:29 np0005548916 kernel: pnp: PnP ACPI: found 5 devices
Dec  6 04:00:29 np0005548916 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_INET protocol family
Dec  6 04:00:29 np0005548916 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  6 04:00:29 np0005548916 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_XDP protocol family
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  6 04:00:29 np0005548916 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  6 04:00:29 np0005548916 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  6 04:00:29 np0005548916 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 81127 usecs
Dec  6 04:00:29 np0005548916 kernel: PCI: CLS 0 bytes, default 64
Dec  6 04:00:29 np0005548916 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  6 04:00:29 np0005548916 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  6 04:00:29 np0005548916 kernel: ACPI: bus type thunderbolt registered
Dec  6 04:00:29 np0005548916 kernel: Trying to unpack rootfs image as initramfs...
Dec  6 04:00:29 np0005548916 kernel: Initialise system trusted keyrings
Dec  6 04:00:29 np0005548916 kernel: Key type blacklist registered
Dec  6 04:00:29 np0005548916 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  6 04:00:29 np0005548916 kernel: zbud: loaded
Dec  6 04:00:29 np0005548916 kernel: integrity: Platform Keyring initialized
Dec  6 04:00:29 np0005548916 kernel: integrity: Machine keyring initialized
Dec  6 04:00:29 np0005548916 kernel: Freeing initrd memory: 87804K
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_ALG protocol family
Dec  6 04:00:29 np0005548916 kernel: xor: automatically using best checksumming function   avx       
Dec  6 04:00:29 np0005548916 kernel: Key type asymmetric registered
Dec  6 04:00:29 np0005548916 kernel: Asymmetric key parser 'x509' registered
Dec  6 04:00:29 np0005548916 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  6 04:00:29 np0005548916 kernel: io scheduler mq-deadline registered
Dec  6 04:00:29 np0005548916 kernel: io scheduler kyber registered
Dec  6 04:00:29 np0005548916 kernel: io scheduler bfq registered
Dec  6 04:00:29 np0005548916 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  6 04:00:29 np0005548916 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  6 04:00:29 np0005548916 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  6 04:00:29 np0005548916 kernel: ACPI: button: Power Button [PWRF]
Dec  6 04:00:29 np0005548916 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  6 04:00:29 np0005548916 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  6 04:00:29 np0005548916 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  6 04:00:29 np0005548916 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  6 04:00:29 np0005548916 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  6 04:00:29 np0005548916 kernel: Non-volatile memory driver v1.3
Dec  6 04:00:29 np0005548916 kernel: rdac: device handler registered
Dec  6 04:00:29 np0005548916 kernel: hp_sw: device handler registered
Dec  6 04:00:29 np0005548916 kernel: emc: device handler registered
Dec  6 04:00:29 np0005548916 kernel: alua: device handler registered
Dec  6 04:00:29 np0005548916 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  6 04:00:29 np0005548916 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  6 04:00:29 np0005548916 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  6 04:00:29 np0005548916 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  6 04:00:29 np0005548916 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  6 04:00:29 np0005548916 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  6 04:00:29 np0005548916 kernel: usb usb1: Product: UHCI Host Controller
Dec  6 04:00:29 np0005548916 kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec  6 04:00:29 np0005548916 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  6 04:00:29 np0005548916 kernel: hub 1-0:1.0: USB hub found
Dec  6 04:00:29 np0005548916 kernel: hub 1-0:1.0: 2 ports detected
Dec  6 04:00:29 np0005548916 kernel: usbcore: registered new interface driver usbserial_generic
Dec  6 04:00:29 np0005548916 kernel: usbserial: USB Serial support registered for generic
Dec  6 04:00:29 np0005548916 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  6 04:00:29 np0005548916 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  6 04:00:29 np0005548916 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  6 04:00:29 np0005548916 kernel: mousedev: PS/2 mouse device common for all mice
Dec  6 04:00:29 np0005548916 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  6 04:00:29 np0005548916 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  6 04:00:29 np0005548916 kernel: rtc_cmos 00:04: registered as rtc0
Dec  6 04:00:29 np0005548916 kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T09:00:28 UTC (1765011628)
Dec  6 04:00:29 np0005548916 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  6 04:00:29 np0005548916 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  6 04:00:29 np0005548916 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  6 04:00:29 np0005548916 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  6 04:00:29 np0005548916 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  6 04:00:29 np0005548916 kernel: usbcore: registered new interface driver usbhid
Dec  6 04:00:29 np0005548916 kernel: usbhid: USB HID core driver
Dec  6 04:00:29 np0005548916 kernel: drop_monitor: Initializing network drop monitor service
Dec  6 04:00:29 np0005548916 kernel: Initializing XFRM netlink socket
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_INET6 protocol family
Dec  6 04:00:29 np0005548916 kernel: Segment Routing with IPv6
Dec  6 04:00:29 np0005548916 kernel: NET: Registered PF_PACKET protocol family
Dec  6 04:00:29 np0005548916 kernel: mpls_gso: MPLS GSO support
Dec  6 04:00:29 np0005548916 kernel: IPI shorthand broadcast: enabled
Dec  6 04:00:29 np0005548916 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  6 04:00:29 np0005548916 kernel: AES CTR mode by8 optimization enabled
Dec  6 04:00:29 np0005548916 kernel: sched_clock: Marking stable (1213005399, 145039410)->(1440318279, -82273470)
Dec  6 04:00:29 np0005548916 kernel: registered taskstats version 1
Dec  6 04:00:29 np0005548916 kernel: Loading compiled-in X.509 certificates
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  6 04:00:29 np0005548916 kernel: Demotion targets for Node 0: null
Dec  6 04:00:29 np0005548916 kernel: page_owner is disabled
Dec  6 04:00:29 np0005548916 kernel: Key type .fscrypt registered
Dec  6 04:00:29 np0005548916 kernel: Key type fscrypt-provisioning registered
Dec  6 04:00:29 np0005548916 kernel: Key type big_key registered
Dec  6 04:00:29 np0005548916 kernel: Key type encrypted registered
Dec  6 04:00:29 np0005548916 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  6 04:00:29 np0005548916 kernel: Loading compiled-in module X.509 certificates
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  6 04:00:29 np0005548916 kernel: ima: Allocated hash algorithm: sha256
Dec  6 04:00:29 np0005548916 kernel: ima: No architecture policies found
Dec  6 04:00:29 np0005548916 kernel: evm: Initialising EVM extended attributes:
Dec  6 04:00:29 np0005548916 kernel: evm: security.selinux
Dec  6 04:00:29 np0005548916 kernel: evm: security.SMACK64 (disabled)
Dec  6 04:00:29 np0005548916 kernel: evm: security.SMACK64EXEC (disabled)
Dec  6 04:00:29 np0005548916 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  6 04:00:29 np0005548916 kernel: evm: security.SMACK64MMAP (disabled)
Dec  6 04:00:29 np0005548916 kernel: evm: security.apparmor (disabled)
Dec  6 04:00:29 np0005548916 kernel: evm: security.ima
Dec  6 04:00:29 np0005548916 kernel: evm: security.capability
Dec  6 04:00:29 np0005548916 kernel: evm: HMAC attrs: 0x1
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  6 04:00:29 np0005548916 kernel: Running certificate verification RSA selftest
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  6 04:00:29 np0005548916 kernel: Running certificate verification ECDSA selftest
Dec  6 04:00:29 np0005548916 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  6 04:00:29 np0005548916 kernel: clk: Disabling unused clocks
Dec  6 04:00:29 np0005548916 kernel: Freeing unused decrypted memory: 2028K
Dec  6 04:00:29 np0005548916 kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec  6 04:00:29 np0005548916 kernel: Write protecting the kernel read-only data: 30720k
Dec  6 04:00:29 np0005548916 kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec  6 04:00:29 np0005548916 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  6 04:00:29 np0005548916 kernel: Run /init as init process
Dec  6 04:00:29 np0005548916 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  6 04:00:29 np0005548916 systemd: Detected virtualization kvm.
Dec  6 04:00:29 np0005548916 systemd: Detected architecture x86-64.
Dec  6 04:00:29 np0005548916 systemd: Running in initrd.
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: Manufacturer: QEMU
Dec  6 04:00:29 np0005548916 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  6 04:00:29 np0005548916 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  6 04:00:29 np0005548916 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  6 04:00:29 np0005548916 systemd: No hostname configured, using default hostname.
Dec  6 04:00:29 np0005548916 systemd: Hostname set to <localhost>.
Dec  6 04:00:29 np0005548916 systemd: Initializing machine ID from VM UUID.
Dec  6 04:00:29 np0005548916 systemd: Queued start job for default target Initrd Default Target.
Dec  6 04:00:29 np0005548916 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:29 np0005548916 systemd: Reached target Local Encrypted Volumes.
Dec  6 04:00:29 np0005548916 systemd: Reached target Initrd /usr File System.
Dec  6 04:00:29 np0005548916 systemd: Reached target Local File Systems.
Dec  6 04:00:29 np0005548916 systemd: Reached target Path Units.
Dec  6 04:00:29 np0005548916 systemd: Reached target Slice Units.
Dec  6 04:00:29 np0005548916 systemd: Reached target Swaps.
Dec  6 04:00:29 np0005548916 systemd: Reached target Timer Units.
Dec  6 04:00:29 np0005548916 systemd: Listening on D-Bus System Message Bus Socket.
Dec  6 04:00:29 np0005548916 systemd: Listening on Journal Socket (/dev/log).
Dec  6 04:00:29 np0005548916 systemd: Listening on Journal Socket.
Dec  6 04:00:29 np0005548916 systemd: Listening on udev Control Socket.
Dec  6 04:00:29 np0005548916 systemd: Listening on udev Kernel Socket.
Dec  6 04:00:29 np0005548916 systemd: Reached target Socket Units.
Dec  6 04:00:29 np0005548916 systemd: Starting Create List of Static Device Nodes...
Dec  6 04:00:29 np0005548916 systemd: Starting Journal Service...
Dec  6 04:00:29 np0005548916 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  6 04:00:29 np0005548916 systemd: Starting Apply Kernel Variables...
Dec  6 04:00:29 np0005548916 systemd: Starting Create System Users...
Dec  6 04:00:29 np0005548916 systemd: Starting Setup Virtual Console...
Dec  6 04:00:29 np0005548916 systemd: Finished Create List of Static Device Nodes.
Dec  6 04:00:29 np0005548916 systemd: Finished Apply Kernel Variables.
Dec  6 04:00:29 np0005548916 systemd: Finished Create System Users.
Dec  6 04:00:29 np0005548916 systemd-journald[309]: Journal started
Dec  6 04:00:29 np0005548916 systemd-journald[309]: Runtime Journal (/run/log/journal/9a5f3f62e1ed4c638d00a3c5e56bbddc) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:29 np0005548916 systemd-sysusers[312]: Creating group 'users' with GID 100.
Dec  6 04:00:29 np0005548916 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Dec  6 04:00:29 np0005548916 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  6 04:00:29 np0005548916 systemd: Started Journal Service.
Dec  6 04:00:29 np0005548916 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  6 04:00:29 np0005548916 systemd[1]: Starting Create Volatile Files and Directories...
Dec  6 04:00:29 np0005548916 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  6 04:00:29 np0005548916 systemd[1]: Finished Setup Virtual Console.
Dec  6 04:00:29 np0005548916 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  6 04:00:29 np0005548916 systemd[1]: Starting dracut cmdline hook...
Dec  6 04:00:29 np0005548916 systemd[1]: Finished Create Volatile Files and Directories.
Dec  6 04:00:29 np0005548916 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec  6 04:00:30 np0005548916 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:30 np0005548916 systemd[1]: Finished dracut cmdline hook.
Dec  6 04:00:30 np0005548916 systemd[1]: Starting dracut pre-udev hook...
Dec  6 04:00:30 np0005548916 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  6 04:00:30 np0005548916 kernel: device-mapper: uevent: version 1.0.3
Dec  6 04:00:30 np0005548916 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  6 04:00:30 np0005548916 kernel: RPC: Registered named UNIX socket transport module.
Dec  6 04:00:30 np0005548916 kernel: RPC: Registered udp transport module.
Dec  6 04:00:30 np0005548916 kernel: RPC: Registered tcp transport module.
Dec  6 04:00:30 np0005548916 kernel: RPC: Registered tcp-with-tls transport module.
Dec  6 04:00:30 np0005548916 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  6 04:00:30 np0005548916 rpc.statd[443]: Version 2.5.4 starting
Dec  6 04:00:30 np0005548916 rpc.statd[443]: Initializing NSM state
Dec  6 04:00:30 np0005548916 rpc.idmapd[448]: Setting log level to 0
Dec  6 04:00:30 np0005548916 systemd[1]: Finished dracut pre-udev hook.
Dec  6 04:00:30 np0005548916 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  6 04:00:30 np0005548916 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec  6 04:00:30 np0005548916 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  6 04:00:30 np0005548916 systemd[1]: Starting dracut pre-trigger hook...
Dec  6 04:00:30 np0005548916 systemd[1]: Finished dracut pre-trigger hook.
Dec  6 04:00:30 np0005548916 systemd[1]: Starting Coldplug All udev Devices...
Dec  6 04:00:30 np0005548916 systemd[1]: Created slice Slice /system/modprobe.
Dec  6 04:00:30 np0005548916 systemd[1]: Starting Load Kernel Module configfs...
Dec  6 04:00:30 np0005548916 systemd[1]: Finished Coldplug All udev Devices.
Dec  6 04:00:30 np0005548916 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:30 np0005548916 systemd[1]: Finished Load Kernel Module configfs.
Dec  6 04:00:30 np0005548916 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  6 04:00:30 np0005548916 systemd[1]: Reached target Network.
Dec  6 04:00:30 np0005548916 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  6 04:00:30 np0005548916 systemd[1]: Starting dracut initqueue hook...
Dec  6 04:00:30 np0005548916 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  6 04:00:30 np0005548916 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  6 04:00:30 np0005548916 kernel: vda: vda1
Dec  6 04:00:30 np0005548916 systemd-udevd[501]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:00:30 np0005548916 kernel: scsi host0: ata_piix
Dec  6 04:00:30 np0005548916 systemd[1]: Mounting Kernel Configuration File System...
Dec  6 04:00:30 np0005548916 kernel: scsi host1: ata_piix
Dec  6 04:00:30 np0005548916 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  6 04:00:30 np0005548916 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  6 04:00:30 np0005548916 systemd[1]: Mounted Kernel Configuration File System.
Dec  6 04:00:30 np0005548916 systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  6 04:00:30 np0005548916 systemd[1]: Reached target Initrd Root Device.
Dec  6 04:00:30 np0005548916 systemd[1]: Reached target System Initialization.
Dec  6 04:00:30 np0005548916 systemd[1]: Reached target Basic System.
Dec  6 04:00:30 np0005548916 kernel: ata1: found unknown device (class 0)
Dec  6 04:00:30 np0005548916 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  6 04:00:30 np0005548916 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  6 04:00:30 np0005548916 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  6 04:00:31 np0005548916 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  6 04:00:31 np0005548916 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  6 04:00:31 np0005548916 systemd[1]: Finished dracut initqueue hook.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Remote File Systems.
Dec  6 04:00:31 np0005548916 systemd[1]: Starting dracut pre-mount hook...
Dec  6 04:00:31 np0005548916 systemd[1]: Finished dracut pre-mount hook.
Dec  6 04:00:31 np0005548916 systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec  6 04:00:31 np0005548916 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec  6 04:00:31 np0005548916 systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  6 04:00:31 np0005548916 systemd[1]: Mounting /sysroot...
Dec  6 04:00:31 np0005548916 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  6 04:00:31 np0005548916 kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec  6 04:00:31 np0005548916 kernel: XFS (vda1): Ending clean mount
Dec  6 04:00:31 np0005548916 systemd[1]: Mounted /sysroot.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Initrd Root File System.
Dec  6 04:00:31 np0005548916 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  6 04:00:31 np0005548916 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  6 04:00:31 np0005548916 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Initrd File Systems.
Dec  6 04:00:31 np0005548916 systemd[1]: Reached target Initrd Default Target.
Dec  6 04:00:31 np0005548916 systemd[1]: Starting dracut mount hook...
Dec  6 04:00:31 np0005548916 systemd[1]: Finished dracut mount hook.
Dec  6 04:00:31 np0005548916 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  6 04:00:32 np0005548916 rpc.idmapd[448]: exiting on signal 15
Dec  6 04:00:32 np0005548916 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Network.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Timer Units.
Dec  6 04:00:32 np0005548916 systemd[1]: dbus.socket: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Initrd Default Target.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Basic System.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Initrd Root Device.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Initrd /usr File System.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Path Units.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Remote File Systems.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Slice Units.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Socket Units.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target System Initialization.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Local File Systems.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Swaps.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut mount hook.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut pre-mount hook.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut initqueue hook.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Apply Kernel Variables.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Coldplug All udev Devices.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut pre-trigger hook.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Setup Virtual Console.
Dec  6 04:00:32 np0005548916 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-udevd.service: Consumed 1.082s CPU time.
Dec  6 04:00:32 np0005548916 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Closed udev Control Socket.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Closed udev Kernel Socket.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut pre-udev hook.
Dec  6 04:00:32 np0005548916 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped dracut cmdline hook.
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Cleanup udev Database...
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  6 04:00:32 np0005548916 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Stopped Create System Users.
Dec  6 04:00:32 np0005548916 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Cleanup udev Database.
Dec  6 04:00:32 np0005548916 systemd[1]: Reached target Switch Root.
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Switch Root...
Dec  6 04:00:32 np0005548916 systemd[1]: Switching root.
Dec  6 04:00:32 np0005548916 systemd-journald[309]: Journal stopped
Dec  6 04:00:32 np0005548916 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  6 04:00:32 np0005548916 kernel: audit: type=1404 audit(1765011632.287:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:00:32 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:00:32 np0005548916 kernel: audit: type=1403 audit(1765011632.420:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  6 04:00:32 np0005548916 systemd: Successfully loaded SELinux policy in 136.180ms.
Dec  6 04:00:32 np0005548916 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.632ms.
Dec  6 04:00:32 np0005548916 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  6 04:00:32 np0005548916 systemd: Detected virtualization kvm.
Dec  6 04:00:32 np0005548916 systemd: Detected architecture x86-64.
Dec  6 04:00:32 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:00:32 np0005548916 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd: Stopped Switch Root.
Dec  6 04:00:32 np0005548916 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  6 04:00:32 np0005548916 systemd: Created slice Slice /system/getty.
Dec  6 04:00:32 np0005548916 systemd: Created slice Slice /system/serial-getty.
Dec  6 04:00:32 np0005548916 systemd: Created slice Slice /system/sshd-keygen.
Dec  6 04:00:32 np0005548916 systemd: Created slice User and Session Slice.
Dec  6 04:00:32 np0005548916 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:32 np0005548916 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  6 04:00:32 np0005548916 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  6 04:00:32 np0005548916 systemd: Reached target Local Encrypted Volumes.
Dec  6 04:00:32 np0005548916 systemd: Stopped target Switch Root.
Dec  6 04:00:32 np0005548916 systemd: Stopped target Initrd File Systems.
Dec  6 04:00:32 np0005548916 systemd: Stopped target Initrd Root File System.
Dec  6 04:00:32 np0005548916 systemd: Reached target Local Integrity Protected Volumes.
Dec  6 04:00:32 np0005548916 systemd: Reached target Path Units.
Dec  6 04:00:32 np0005548916 systemd: Reached target rpc_pipefs.target.
Dec  6 04:00:32 np0005548916 systemd: Reached target Slice Units.
Dec  6 04:00:32 np0005548916 systemd: Reached target Swaps.
Dec  6 04:00:32 np0005548916 systemd: Reached target Local Verity Protected Volumes.
Dec  6 04:00:32 np0005548916 systemd: Listening on RPCbind Server Activation Socket.
Dec  6 04:00:32 np0005548916 systemd: Reached target RPC Port Mapper.
Dec  6 04:00:32 np0005548916 systemd: Listening on Process Core Dump Socket.
Dec  6 04:00:32 np0005548916 systemd: Listening on initctl Compatibility Named Pipe.
Dec  6 04:00:32 np0005548916 systemd: Listening on udev Control Socket.
Dec  6 04:00:32 np0005548916 systemd: Listening on udev Kernel Socket.
Dec  6 04:00:32 np0005548916 systemd: Mounting Huge Pages File System...
Dec  6 04:00:32 np0005548916 systemd: Mounting POSIX Message Queue File System...
Dec  6 04:00:32 np0005548916 systemd: Mounting Kernel Debug File System...
Dec  6 04:00:32 np0005548916 systemd: Mounting Kernel Trace File System...
Dec  6 04:00:32 np0005548916 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  6 04:00:32 np0005548916 systemd: Starting Create List of Static Device Nodes...
Dec  6 04:00:32 np0005548916 systemd: Starting Load Kernel Module configfs...
Dec  6 04:00:32 np0005548916 systemd: Starting Load Kernel Module drm...
Dec  6 04:00:32 np0005548916 systemd: Starting Load Kernel Module efi_pstore...
Dec  6 04:00:32 np0005548916 systemd: Starting Load Kernel Module fuse...
Dec  6 04:00:32 np0005548916 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  6 04:00:32 np0005548916 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd: Stopped File System Check on Root Device.
Dec  6 04:00:32 np0005548916 systemd: Stopped Journal Service.
Dec  6 04:00:32 np0005548916 kernel: fuse: init (API version 7.37)
Dec  6 04:00:32 np0005548916 systemd: Starting Journal Service...
Dec  6 04:00:32 np0005548916 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  6 04:00:32 np0005548916 systemd: Starting Generate network units from Kernel command line...
Dec  6 04:00:32 np0005548916 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:32 np0005548916 systemd: Starting Remount Root and Kernel File Systems...
Dec  6 04:00:32 np0005548916 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  6 04:00:32 np0005548916 systemd: Starting Apply Kernel Variables...
Dec  6 04:00:32 np0005548916 systemd: Starting Coldplug All udev Devices...
Dec  6 04:00:32 np0005548916 systemd: Mounted Huge Pages File System.
Dec  6 04:00:32 np0005548916 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  6 04:00:32 np0005548916 systemd: Mounted POSIX Message Queue File System.
Dec  6 04:00:32 np0005548916 systemd: Mounted Kernel Debug File System.
Dec  6 04:00:32 np0005548916 systemd: Mounted Kernel Trace File System.
Dec  6 04:00:32 np0005548916 systemd-journald[680]: Journal started
Dec  6 04:00:32 np0005548916 systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:32 np0005548916 systemd[1]: Queued start job for default target Multi-User System.
Dec  6 04:00:32 np0005548916 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd: Started Journal Service.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Create List of Static Device Nodes.
Dec  6 04:00:32 np0005548916 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Load Kernel Module configfs.
Dec  6 04:00:32 np0005548916 kernel: ACPI: bus type drm_connector registered
Dec  6 04:00:32 np0005548916 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  6 04:00:32 np0005548916 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Load Kernel Module fuse.
Dec  6 04:00:32 np0005548916 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Load Kernel Module drm.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Generate network units from Kernel command line.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Apply Kernel Variables.
Dec  6 04:00:32 np0005548916 systemd[1]: Mounting FUSE Control File System...
Dec  6 04:00:32 np0005548916 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Rebuild Hardware Database...
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  6 04:00:32 np0005548916 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Load/Save OS Random Seed...
Dec  6 04:00:32 np0005548916 systemd[1]: Starting Create System Users...
Dec  6 04:00:32 np0005548916 systemd[1]: Mounted FUSE Control File System.
Dec  6 04:00:32 np0005548916 systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:32 np0005548916 systemd-journald[680]: Received client request to flush runtime journal.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  6 04:00:32 np0005548916 systemd[1]: Finished Load/Save OS Random Seed.
Dec  6 04:00:32 np0005548916 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Create System Users.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Coldplug All udev Devices.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target Preparation for Local File Systems.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target Local File Systems.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  6 04:00:33 np0005548916 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  6 04:00:33 np0005548916 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  6 04:00:33 np0005548916 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Automatic Boot Loader Update...
Dec  6 04:00:33 np0005548916 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Create Volatile Files and Directories...
Dec  6 04:00:33 np0005548916 bootctl[697]: Couldn't find EFI system partition, skipping.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Automatic Boot Loader Update.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Create Volatile Files and Directories.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Security Auditing Service...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting RPC Bind...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Rebuild Journal Catalog...
Dec  6 04:00:33 np0005548916 auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  6 04:00:33 np0005548916 auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Rebuild Journal Catalog.
Dec  6 04:00:33 np0005548916 systemd[1]: Started RPC Bind.
Dec  6 04:00:33 np0005548916 augenrules[708]: /sbin/augenrules: No change
Dec  6 04:00:33 np0005548916 augenrules[723]: No rules
Dec  6 04:00:33 np0005548916 augenrules[723]: enabled 1
Dec  6 04:00:33 np0005548916 augenrules[723]: failure 1
Dec  6 04:00:33 np0005548916 augenrules[723]: pid 703
Dec  6 04:00:33 np0005548916 augenrules[723]: rate_limit 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_limit 8192
Dec  6 04:00:33 np0005548916 augenrules[723]: lost 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog 3
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time 60000
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time_actual 0
Dec  6 04:00:33 np0005548916 augenrules[723]: enabled 1
Dec  6 04:00:33 np0005548916 augenrules[723]: failure 1
Dec  6 04:00:33 np0005548916 augenrules[723]: pid 703
Dec  6 04:00:33 np0005548916 augenrules[723]: rate_limit 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_limit 8192
Dec  6 04:00:33 np0005548916 augenrules[723]: lost 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog 2
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time 60000
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time_actual 0
Dec  6 04:00:33 np0005548916 augenrules[723]: enabled 1
Dec  6 04:00:33 np0005548916 augenrules[723]: failure 1
Dec  6 04:00:33 np0005548916 augenrules[723]: pid 703
Dec  6 04:00:33 np0005548916 augenrules[723]: rate_limit 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_limit 8192
Dec  6 04:00:33 np0005548916 augenrules[723]: lost 0
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog 2
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time 60000
Dec  6 04:00:33 np0005548916 augenrules[723]: backlog_wait_time_actual 0
Dec  6 04:00:33 np0005548916 systemd[1]: Started Security Auditing Service.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Rebuild Hardware Database.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Update is Completed...
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Update is Completed.
Dec  6 04:00:33 np0005548916 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Dec  6 04:00:33 np0005548916 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target System Initialization.
Dec  6 04:00:33 np0005548916 systemd[1]: Started dnf makecache --timer.
Dec  6 04:00:33 np0005548916 systemd[1]: Started Daily rotation of log files.
Dec  6 04:00:33 np0005548916 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target Timer Units.
Dec  6 04:00:33 np0005548916 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  6 04:00:33 np0005548916 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target Socket Units.
Dec  6 04:00:33 np0005548916 systemd-udevd[743]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting D-Bus System Message Bus...
Dec  6 04:00:33 np0005548916 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:33 np0005548916 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Load Kernel Module configfs...
Dec  6 04:00:33 np0005548916 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Load Kernel Module configfs.
Dec  6 04:00:33 np0005548916 systemd[1]: Started D-Bus System Message Bus.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target Basic System.
Dec  6 04:00:33 np0005548916 dbus-broker-lau[770]: Ready
Dec  6 04:00:33 np0005548916 systemd[1]: Starting NTP client/server...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  6 04:00:33 np0005548916 systemd[1]: Starting IPv4 firewall with iptables...
Dec  6 04:00:33 np0005548916 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  6 04:00:33 np0005548916 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  6 04:00:33 np0005548916 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  6 04:00:33 np0005548916 systemd[1]: Started irqbalance daemon.
Dec  6 04:00:33 np0005548916 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  6 04:00:33 np0005548916 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:33 np0005548916 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:33 np0005548916 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target sshd-keygen.target.
Dec  6 04:00:33 np0005548916 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  6 04:00:33 np0005548916 systemd[1]: Reached target User and Group Name Lookups.
Dec  6 04:00:33 np0005548916 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  6 04:00:33 np0005548916 systemd[1]: Starting User Login Management...
Dec  6 04:00:33 np0005548916 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  6 04:00:33 np0005548916 chronyd[796]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  6 04:00:33 np0005548916 chronyd[796]: Loaded 0 symmetric keys
Dec  6 04:00:33 np0005548916 chronyd[796]: Using right/UTC timezone to obtain leap second data
Dec  6 04:00:33 np0005548916 chronyd[796]: Loaded seccomp filter (level 2)
Dec  6 04:00:33 np0005548916 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  6 04:00:33 np0005548916 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  6 04:00:33 np0005548916 systemd[1]: Started NTP client/server.
Dec  6 04:00:33 np0005548916 kernel: Console: switching to colour dummy device 80x25
Dec  6 04:00:33 np0005548916 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  6 04:00:33 np0005548916 kernel: [drm] features: -context_init
Dec  6 04:00:33 np0005548916 kernel: [drm] number of scanouts: 1
Dec  6 04:00:33 np0005548916 kernel: [drm] number of cap sets: 0
Dec  6 04:00:33 np0005548916 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  6 04:00:33 np0005548916 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  6 04:00:33 np0005548916 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  6 04:00:33 np0005548916 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  6 04:00:33 np0005548916 systemd-logind[788]: New seat seat0.
Dec  6 04:00:33 np0005548916 systemd[1]: Started User Login Management.
Dec  6 04:00:33 np0005548916 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  6 04:00:33 np0005548916 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  6 04:00:33 np0005548916 kernel: Console: switching to colour frame buffer device 128x48
Dec  6 04:00:33 np0005548916 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  6 04:00:34 np0005548916 kernel: kvm_amd: TSC scaling supported
Dec  6 04:00:34 np0005548916 kernel: kvm_amd: Nested Virtualization enabled
Dec  6 04:00:34 np0005548916 kernel: kvm_amd: Nested Paging enabled
Dec  6 04:00:34 np0005548916 kernel: kvm_amd: LBR virtualization supported
Dec  6 04:00:34 np0005548916 iptables.init[782]: iptables: Applying firewall rules: [  OK  ]
Dec  6 04:00:34 np0005548916 systemd[1]: Finished IPv4 firewall with iptables.
Dec  6 04:00:34 np0005548916 cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 06 Dec 2025 09:00:34 +0000. Up 6.99 seconds.
Dec  6 04:00:34 np0005548916 systemd[1]: run-cloud\x2dinit-tmp-tmp5zgaxgc3.mount: Deactivated successfully.
Dec  6 04:00:34 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 04:00:34 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 04:00:34 np0005548916 systemd-hostnamed[855]: Hostname set to <np0005548916.novalocal> (static)
Dec  6 04:00:34 np0005548916 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  6 04:00:34 np0005548916 systemd[1]: Reached target Preparation for Network.
Dec  6 04:00:34 np0005548916 systemd[1]: Starting Network Manager...
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9505] NetworkManager (version 1.54.1-1.el9) is starting... (boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9512] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9600] manager[0x559845864080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9660] hostname: hostname: using hostnamed
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9660] hostname: static hostname changed from (none) to "np0005548916.novalocal"
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9665] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9797] manager[0x559845864080]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9798] manager[0x559845864080]: rfkill: WWAN hardware radio set enabled
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9846] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9846] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9847] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9847] manager: Networking is enabled by state file
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9850] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9862] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9881] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9894] dhcp: init: Using DHCP client 'internal'
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9897] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9913] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:00:34 np0005548916 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9920] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9929] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9940] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9943] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9976] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9979] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9981] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9983] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9985] device (eth0): carrier: link connected
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9987] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:00:34 np0005548916 NetworkManager[859]: <info>  [1765011634.9993] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011634.9999] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0003] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0004] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0006] manager: NetworkManager state is now CONNECTING
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0007] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0013] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0016] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0047] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0054] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:00:35 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0076] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 systemd[1]: Started Network Manager.
Dec  6 04:00:35 np0005548916 systemd[1]: Reached target Network.
Dec  6 04:00:35 np0005548916 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:00:35 np0005548916 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  6 04:00:35 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0238] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0242] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0248] device (lo): Activation: successful, device activated.
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0254] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0256] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0259] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0262] device (eth0): Activation: successful, device activated.
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0268] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:00:35 np0005548916 NetworkManager[859]: <info>  [1765011635.0270] manager: startup complete
Dec  6 04:00:35 np0005548916 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  6 04:00:35 np0005548916 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  6 04:00:35 np0005548916 systemd[1]: Reached target NFS client services.
Dec  6 04:00:35 np0005548916 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  6 04:00:35 np0005548916 systemd[1]: Reached target Remote File Systems.
Dec  6 04:00:35 np0005548916 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:35 np0005548916 systemd[1]: Finished Network Manager Wait Online.
Dec  6 04:00:35 np0005548916 systemd[1]: Starting Cloud-init: Network Stage...
Dec  6 04:00:35 np0005548916 cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 06 Dec 2025 09:00:35 +0000. Up 8.01 seconds.
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.113         | 255.255.255.0 | global | fa:16:3e:44:48:bb |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe44:48bb/64 |       .       |  link  | fa:16:3e:44:48:bb |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  6 04:00:35 np0005548916 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:36 np0005548916 cloud-init[923]: Generating public/private rsa key pair.
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key fingerprint is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: SHA256:aFdqn80bOkCfNh8DURqnA70R3wlMATbYgjvI8ZQBB8w root@np0005548916.novalocal
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key's randomart image is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: +---[RSA 3072]----+
Dec  6 04:00:36 np0005548916 cloud-init[923]: |   oooo+.+***.   |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |    E.+ oo=B.o . |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |   . = . .=+. o  |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |    o +..oo.     |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |      ooS. o     |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |     . o..=+o    |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |         ooo+o   |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |          ...o   |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |          ...    |
Dec  6 04:00:36 np0005548916 cloud-init[923]: +----[SHA256]-----+
Dec  6 04:00:36 np0005548916 cloud-init[923]: Generating public/private ecdsa key pair.
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key fingerprint is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: SHA256:/V9Nf6VbQ7CIW8plp6d6vVny2tXx2wtLLC0qPtvLCN0 root@np0005548916.novalocal
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key's randomart image is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: +---[ECDSA 256]---+
Dec  6 04:00:36 np0005548916 cloud-init[923]: |                 |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |                 |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |             .   |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |         .. . o  |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |        S..= o oo|
Dec  6 04:00:36 np0005548916 cloud-init[923]: |      . o *.= .oB|
Dec  6 04:00:36 np0005548916 cloud-init[923]: |     . . E +oB.oX|
Dec  6 04:00:36 np0005548916 cloud-init[923]: |      .o+ ..*oO+*|
Dec  6 04:00:36 np0005548916 cloud-init[923]: |      .++*+. ===o|
Dec  6 04:00:36 np0005548916 cloud-init[923]: +----[SHA256]-----+
Dec  6 04:00:36 np0005548916 cloud-init[923]: Generating public/private ed25519 key pair.
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  6 04:00:36 np0005548916 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key fingerprint is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: SHA256:UXMFl6H0o8BuOE4G7OkZodxjFsGSMf8vDaZdvN0lYLY root@np0005548916.novalocal
Dec  6 04:00:36 np0005548916 cloud-init[923]: The key's randomart image is:
Dec  6 04:00:36 np0005548916 cloud-init[923]: +--[ED25519 256]--+
Dec  6 04:00:36 np0005548916 cloud-init[923]: |    o+.   o +o+o |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |    o+.. o + +.  |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |     .* . o = o  |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |   . + * = + + . |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |    o O S = E . .|
Dec  6 04:00:36 np0005548916 cloud-init[923]: |     + @ B o . o |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |      + + + . .  |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |         .       |
Dec  6 04:00:36 np0005548916 cloud-init[923]: |                 |
Dec  6 04:00:36 np0005548916 cloud-init[923]: +----[SHA256]-----+
Dec  6 04:00:36 np0005548916 sm-notify[1006]: Version 2.5.4 starting
Dec  6 04:00:36 np0005548916 systemd[1]: Finished Cloud-init: Network Stage.
Dec  6 04:00:36 np0005548916 systemd[1]: Reached target Cloud-config availability.
Dec  6 04:00:36 np0005548916 systemd[1]: Reached target Network is Online.
Dec  6 04:00:36 np0005548916 systemd[1]: Starting Cloud-init: Config Stage...
Dec  6 04:00:36 np0005548916 systemd[1]: Starting Crash recovery kernel arming...
Dec  6 04:00:36 np0005548916 systemd[1]: Starting Notify NFS peers of a restart...
Dec  6 04:00:36 np0005548916 systemd[1]: Starting System Logging Service...
Dec  6 04:00:36 np0005548916 systemd[1]: Starting OpenSSH server daemon...
Dec  6 04:00:36 np0005548916 systemd[1]: Starting Permit User Sessions...
Dec  6 04:00:36 np0005548916 systemd[1]: Started Notify NFS peers of a restart.
Dec  6 04:00:36 np0005548916 systemd[1]: Started OpenSSH server daemon.
Dec  6 04:00:36 np0005548916 systemd[1]: Finished Permit User Sessions.
Dec  6 04:00:36 np0005548916 systemd[1]: Started Command Scheduler.
Dec  6 04:00:36 np0005548916 systemd[1]: Started Getty on tty1.
Dec  6 04:00:36 np0005548916 systemd[1]: Started Serial Getty on ttyS0.
Dec  6 04:00:36 np0005548916 systemd[1]: Reached target Login Prompts.
Dec  6 04:00:36 np0005548916 rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec  6 04:00:36 np0005548916 rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  6 04:00:36 np0005548916 systemd[1]: Started System Logging Service.
Dec  6 04:00:36 np0005548916 systemd[1]: Reached target Multi-User System.
Dec  6 04:00:36 np0005548916 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  6 04:00:36 np0005548916 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  6 04:00:36 np0005548916 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  6 04:00:36 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:00:36 np0005548916 kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec  6 04:00:36 np0005548916 kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec  6 04:00:36 np0005548916 cloud-init[1141]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 06 Dec 2025 09:00:36 +0000. Up 9.58 seconds.
Dec  6 04:00:37 np0005548916 systemd[1]: Finished Cloud-init: Config Stage.
Dec  6 04:00:37 np0005548916 systemd[1]: Starting Cloud-init: Final Stage...
Dec  6 04:00:37 np0005548916 dracut[1286]: dracut-057-102.git20250818.el9
Dec  6 04:00:37 np0005548916 cloud-init[1304]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 06 Dec 2025 09:00:37 +0000. Up 9.99 seconds.
Dec  6 04:00:37 np0005548916 cloud-init[1311]: #############################################################
Dec  6 04:00:37 np0005548916 cloud-init[1316]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  6 04:00:37 np0005548916 dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec  6 04:00:37 np0005548916 cloud-init[1322]: 256 SHA256:/V9Nf6VbQ7CIW8plp6d6vVny2tXx2wtLLC0qPtvLCN0 root@np0005548916.novalocal (ECDSA)
Dec  6 04:00:37 np0005548916 cloud-init[1328]: 256 SHA256:UXMFl6H0o8BuOE4G7OkZodxjFsGSMf8vDaZdvN0lYLY root@np0005548916.novalocal (ED25519)
Dec  6 04:00:37 np0005548916 cloud-init[1334]: 3072 SHA256:aFdqn80bOkCfNh8DURqnA70R3wlMATbYgjvI8ZQBB8w root@np0005548916.novalocal (RSA)
Dec  6 04:00:37 np0005548916 cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  6 04:00:37 np0005548916 cloud-init[1338]: #############################################################
Dec  6 04:00:37 np0005548916 cloud-init[1304]: Cloud-init v. 24.4-7.el9 finished at Sat, 06 Dec 2025 09:00:37 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.17 seconds
Dec  6 04:00:37 np0005548916 systemd[1]: Finished Cloud-init: Final Stage.
Dec  6 04:00:37 np0005548916 systemd[1]: Reached target Cloud-init target.
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  6 04:00:37 np0005548916 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: memstrack is not available
Dec  6 04:00:38 np0005548916 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  6 04:00:38 np0005548916 dracut[1288]: memstrack is not available
Dec  6 04:00:38 np0005548916 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  6 04:00:39 np0005548916 dracut[1288]: *** Including module: systemd ***
Dec  6 04:00:39 np0005548916 dracut[1288]: *** Including module: fips ***
Dec  6 04:00:39 np0005548916 dracut[1288]: *** Including module: systemd-initrd ***
Dec  6 04:00:39 np0005548916 dracut[1288]: *** Including module: i18n ***
Dec  6 04:00:39 np0005548916 dracut[1288]: *** Including module: drm ***
Dec  6 04:00:40 np0005548916 chronyd[796]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Dec  6 04:00:40 np0005548916 chronyd[796]: System clock TAI offset set to 37 seconds
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: prefixdevname ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: kernel-modules ***
Dec  6 04:00:40 np0005548916 kernel: block vda: the capability attribute has been deprecated.
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: kernel-modules-extra ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: qemu ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: fstab-sys ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: rootfs-block ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: terminfo ***
Dec  6 04:00:40 np0005548916 dracut[1288]: *** Including module: udev-rules ***
Dec  6 04:00:41 np0005548916 dracut[1288]: Skipping udev rule: 91-permissions.rules
Dec  6 04:00:41 np0005548916 dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: virtiofs ***
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: dracut-systemd ***
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: usrmount ***
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: base ***
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: fs-lib ***
Dec  6 04:00:41 np0005548916 dracut[1288]: *** Including module: kdumpbase ***
Dec  6 04:00:42 np0005548916 dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  6 04:00:42 np0005548916 dracut[1288]:  microcode_ctl module: mangling fw_dir
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  6 04:00:42 np0005548916 dracut[1288]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  6 04:00:42 np0005548916 dracut[1288]: *** Including module: openssl ***
Dec  6 04:00:42 np0005548916 dracut[1288]: *** Including module: shutdown ***
Dec  6 04:00:43 np0005548916 dracut[1288]: *** Including module: squash ***
Dec  6 04:00:43 np0005548916 dracut[1288]: *** Including modules done ***
Dec  6 04:00:43 np0005548916 dracut[1288]: *** Installing kernel module dependencies ***
Dec  6 04:00:43 np0005548916 dracut[1288]: *** Installing kernel module dependencies done ***
Dec  6 04:00:43 np0005548916 dracut[1288]: *** Resolving executable dependencies ***
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 25 affinity is now unmanaged
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 31 affinity is now unmanaged
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 28 affinity is now unmanaged
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 32 affinity is now unmanaged
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 30 affinity is now unmanaged
Dec  6 04:00:44 np0005548916 irqbalance[783]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  6 04:00:44 np0005548916 irqbalance[783]: IRQ 29 affinity is now unmanaged
Dec  6 04:00:45 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:00:45 np0005548916 dracut[1288]: *** Resolving executable dependencies done ***
Dec  6 04:00:45 np0005548916 dracut[1288]: *** Generating early-microcode cpio image ***
Dec  6 04:00:45 np0005548916 dracut[1288]: *** Store current command line parameters ***
Dec  6 04:00:45 np0005548916 dracut[1288]: Stored kernel commandline:
Dec  6 04:00:45 np0005548916 dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Dec  6 04:00:45 np0005548916 dracut[1288]: *** Install squash loader ***
Dec  6 04:00:46 np0005548916 dracut[1288]: *** Squashing the files inside the initramfs ***
Dec  6 04:00:47 np0005548916 dracut[1288]: *** Squashing the files inside the initramfs done ***
Dec  6 04:00:47 np0005548916 dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec  6 04:00:47 np0005548916 dracut[1288]: *** Hardlinking files ***
Dec  6 04:00:47 np0005548916 dracut[1288]: *** Hardlinking files done ***
Dec  6 04:00:48 np0005548916 dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec  6 04:00:48 np0005548916 kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec  6 04:00:48 np0005548916 kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec  6 04:00:48 np0005548916 systemd[1]: Finished Crash recovery kernel arming.
Dec  6 04:00:48 np0005548916 systemd[1]: Startup finished in 1.552s (kernel) + 3.408s (initrd) + 16.454s (userspace) = 21.415s.
Dec  6 04:00:54 np0005548916 systemd[1]: Created slice User Slice of UID 1000.
Dec  6 04:00:54 np0005548916 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  6 04:00:54 np0005548916 systemd-logind[788]: New session 1 of user zuul.
Dec  6 04:00:54 np0005548916 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  6 04:00:54 np0005548916 systemd[1]: Starting User Manager for UID 1000...
Dec  6 04:00:54 np0005548916 systemd[4303]: Queued start job for default target Main User Target.
Dec  6 04:00:54 np0005548916 systemd[4303]: Created slice User Application Slice.
Dec  6 04:00:54 np0005548916 systemd[4303]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:00:54 np0005548916 systemd[4303]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:00:54 np0005548916 systemd[4303]: Reached target Paths.
Dec  6 04:00:54 np0005548916 systemd[4303]: Reached target Timers.
Dec  6 04:00:54 np0005548916 systemd[4303]: Starting D-Bus User Message Bus Socket...
Dec  6 04:00:54 np0005548916 systemd[4303]: Starting Create User's Volatile Files and Directories...
Dec  6 04:00:54 np0005548916 systemd[4303]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:00:54 np0005548916 systemd[4303]: Reached target Sockets.
Dec  6 04:00:54 np0005548916 systemd[4303]: Finished Create User's Volatile Files and Directories.
Dec  6 04:00:54 np0005548916 systemd[4303]: Reached target Basic System.
Dec  6 04:00:54 np0005548916 systemd[4303]: Reached target Main User Target.
Dec  6 04:00:54 np0005548916 systemd[4303]: Startup finished in 163ms.
Dec  6 04:00:54 np0005548916 systemd[1]: Started User Manager for UID 1000.
Dec  6 04:00:54 np0005548916 systemd[1]: Started Session 1 of User zuul.
Dec  6 04:00:55 np0005548916 python3[4385]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:00:58 np0005548916 python3[4413]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:05 np0005548916 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:01:05 np0005548916 python3[4488]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:06 np0005548916 python3[4528]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  6 04:01:08 np0005548916 python3[4554]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDU0JPqo3RlcbkISWeWyZyh8N1DipPCXKbgbj83sLrBXd5pRLoLdbqBjiuLvFfP7lb5gET6+eP3VZiOMI6UHmEm8ynKQRTIQ7lxC6wlJ/5bEkQ7shEony5Dt8S+/YriKnW8SR/bfYJwGVDGiYwX9+YLTEkgtaWYCW5aOhF1JYR2fNVZQyTaBuiZFc/j1+ce31wCfSAIAFETx4TP71KVZET/mDhOPfYQSE6dNJCcZnohKVSa1SHNL0bVxbehOrQrmqmiRc81piGO4LAMvuSM3op7QTjc7lDDNoYX/DWm/O6Yd8IV5PAI5jAYm4zViXyj8K/iPfclSAUCutpd/HwsQjjiI9Ei0ObVrpLhV3PWw6UkMmfRl4sN90Bhg/95I6taoeEDSSNojukndyGr3lxM1SkEHO0ZamuvQmAOsP05x89hsZFP9E+RntviBPqrCNyyiE7JEy2H1WfIK5i0KA/BC8M+osytKOc1zBu/jI4TYPr32yUNd7mIBDzpNaUok32L4Pk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:09 np0005548916 python3[4578]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:09 np0005548916 python3[4677]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:10 np0005548916 python3[4748]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011669.4133372-252-170856646153745/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa follow=False checksum=1c48fa8bdbec038bf9f0f4b497dca115d790ad66 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:10 np0005548916 python3[4871]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:11 np0005548916 python3[4942]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011670.3117692-307-246098217896273/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa.pub follow=False checksum=e7cbe2647d02b25f8aa52dd3d3a0ea1aa1cad833 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:12 np0005548916 python3[4990]: ansible-ping Invoked with data=pong
Dec  6 04:01:13 np0005548916 python3[5014]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:15 np0005548916 python3[5072]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  6 04:01:16 np0005548916 python3[5104]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:16 np0005548916 python3[5128]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548916 python3[5152]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548916 python3[5176]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548916 python3[5200]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:18 np0005548916 python3[5224]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:19 np0005548916 python3[5250]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:20 np0005548916 python3[5328]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:20 np0005548916 python3[5401]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011680.0756123-32-186420266149685/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:21 np0005548916 python3[5449]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:21 np0005548916 python3[5473]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548916 python3[5497]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548916 python3[5521]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548916 python3[5545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548916 python3[5569]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548916 python3[5593]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548916 python3[5617]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548916 python3[5641]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548916 python3[5665]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548916 python3[5689]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548916 python3[5713]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548916 python3[5737]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548916 python3[5761]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548916 python3[5785]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548916 python3[5809]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548916 python3[5833]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548916 python3[5857]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548916 python3[5881]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548916 python3[5905]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548916 python3[5929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548916 python3[5953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548916 python3[5977]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548916 python3[6001]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548916 python3[6025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548916 python3[6049]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:31 np0005548916 python3[6075]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:01:31 np0005548916 systemd[1]: Starting Time & Date Service...
Dec  6 04:01:31 np0005548916 systemd[1]: Started Time & Date Service.
Dec  6 04:01:31 np0005548916 systemd-timedated[6077]: Changed time zone to 'UTC' (UTC).
Dec  6 04:01:32 np0005548916 python3[6106]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:32 np0005548916 python3[6182]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:33 np0005548916 python3[6253]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765011692.431627-252-264182487388174/source _original_basename=tmp4gtehc5d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:33 np0005548916 python3[6353]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:34 np0005548916 python3[6424]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011693.3420992-303-216107513659361/source _original_basename=tmplwio4_od follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:35 np0005548916 python3[6526]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:35 np0005548916 python3[6599]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011694.6634886-383-9733592598989/source _original_basename=tmpml1ps5qd follow=False checksum=0200c222fd008cff1969c6c814381aad26405e22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:35 np0005548916 python3[6647]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:36 np0005548916 python3[6673]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:36 np0005548916 python3[6753]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:37 np0005548916 python3[6826]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011696.3969817-453-219858681411835/source _original_basename=tmpqw1g566e follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:37 np0005548916 python3[6877]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-c2c1-5ee8-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:38 np0005548916 python3[6905]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-c2c1-5ee8-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  6 04:01:39 np0005548916 python3[6933]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:59 np0005548916 python3[6959]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:02:01 np0005548916 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:02:59 np0005548916 systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  6 04:03:03 np0005548916 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  6 04:03:03 np0005548916 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7222] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:03:03 np0005548916 systemd-udevd[6963]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7504] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7533] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7537] device (eth1): carrier: link connected
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7539] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7544] policy: auto-activating connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7547] device (eth1): Activation: starting connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7548] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7551] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7554] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:03:03 np0005548916 NetworkManager[859]: <info>  [1765011783.7558] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:03 np0005548916 systemd[4303]: Starting Mark boot as successful...
Dec  6 04:03:03 np0005548916 systemd[4303]: Finished Mark boot as successful.
Dec  6 04:03:04 np0005548916 systemd-logind[788]: New session 3 of user zuul.
Dec  6 04:03:04 np0005548916 systemd[1]: Started Session 3 of User zuul.
Dec  6 04:03:04 np0005548916 python3[6994]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-5a9f-9569-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:03:15 np0005548916 python3[7074]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:03:15 np0005548916 python3[7147]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011794.7520046-155-265520938592310/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=830b9277befaf6f767205b89543169fefeef2ac1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:03:16 np0005548916 python3[7197]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:03:16 np0005548916 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  6 04:03:16 np0005548916 systemd[1]: Stopped Network Manager Wait Online.
Dec  6 04:03:16 np0005548916 systemd[1]: Stopping Network Manager Wait Online...
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0699] caught SIGTERM, shutting down normally.
Dec  6 04:03:16 np0005548916 systemd[1]: Stopping Network Manager...
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0713] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0713] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0714] dhcp4 (eth0): state changed no lease
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0719] manager: NetworkManager state is now CONNECTING
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0812] dhcp4 (eth1): canceled DHCP transaction
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0812] dhcp4 (eth1): state changed no lease
Dec  6 04:03:16 np0005548916 NetworkManager[859]: <info>  [1765011796.0885] exiting (success)
Dec  6 04:03:16 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:03:16 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:03:16 np0005548916 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  6 04:03:16 np0005548916 systemd[1]: Stopped Network Manager.
Dec  6 04:03:16 np0005548916 systemd[1]: NetworkManager.service: Consumed 1.088s CPU time, 10.0M memory peak.
Dec  6 04:03:16 np0005548916 systemd[1]: Starting Network Manager...
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.1555] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.1558] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.1620] manager[0x565155928070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:03:16 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 04:03:16 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2678] hostname: hostname: using hostnamed
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2678] hostname: static hostname changed from (none) to "np0005548916.novalocal"
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2685] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2692] manager[0x565155928070]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2692] manager[0x565155928070]: rfkill: WWAN hardware radio set enabled
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2730] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2730] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2731] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2732] manager: Networking is enabled by state file
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2735] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2740] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2774] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2786] dhcp: init: Using DHCP client 'internal'
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2790] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2796] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2803] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2812] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2820] device (eth0): carrier: link connected
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2825] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2830] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2830] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2837] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2843] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2850] device (eth1): carrier: link connected
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2855] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2860] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2) (indicated)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2861] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2868] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2875] device (eth1): Activation: starting connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2882] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:03:16 np0005548916 systemd[1]: Started Network Manager.
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2885] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2887] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2888] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2891] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2894] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2896] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2899] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2904] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2913] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2924] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2934] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2936] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2953] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2959] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2964] device (lo): Activation: successful, device activated.
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2971] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.2977] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:03:16 np0005548916 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3157] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3221] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3223] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3227] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3232] device (eth0): Activation: successful, device activated.
Dec  6 04:03:16 np0005548916 NetworkManager[7209]: <info>  [1765011796.3238] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:03:16 np0005548916 python3[7281]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-5a9f-9569-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:03:26 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:03:46 np0005548916 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3250] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:04:01 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:04:01 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3624] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3629] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3641] device (eth1): Activation: successful, device activated.
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3649] manager: startup complete
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3651] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <warn>  [1765011841.3663] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3674] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 systemd[1]: Finished Network Manager Wait Online.
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): canceled DHCP transaction
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): state changed no lease
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3813] policy: auto-activating connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3818] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3820] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3825] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3833] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3845] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3888] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3892] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:04:01 np0005548916 NetworkManager[7209]: <info>  [1765011841.3904] device (eth1): Activation: successful, device activated.
Dec  6 04:04:11 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:04:16 np0005548916 systemd[1]: session-3.scope: Deactivated successfully.
Dec  6 04:04:16 np0005548916 systemd[1]: session-3.scope: Consumed 1.722s CPU time.
Dec  6 04:04:16 np0005548916 systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Dec  6 04:04:16 np0005548916 systemd-logind[788]: Removed session 3.
Dec  6 04:04:51 np0005548916 systemd-logind[788]: New session 4 of user zuul.
Dec  6 04:04:51 np0005548916 systemd[1]: Started Session 4 of User zuul.
Dec  6 04:04:52 np0005548916 python3[7392]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:04:52 np0005548916 python3[7465]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011892.0843112-373-172733315032397/source _original_basename=tmpce7o65fw follow=False checksum=81d87914000d1f03e4ba3a0a6e4eda468c65f433 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:04:55 np0005548916 systemd[1]: session-4.scope: Deactivated successfully.
Dec  6 04:04:55 np0005548916 systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Dec  6 04:04:55 np0005548916 systemd-logind[788]: Removed session 4.
Dec  6 04:06:48 np0005548916 systemd[4303]: Created slice User Background Tasks Slice.
Dec  6 04:06:48 np0005548916 systemd[4303]: Starting Cleanup of User's Temporary Files and Directories...
Dec  6 04:06:48 np0005548916 systemd[4303]: Finished Cleanup of User's Temporary Files and Directories.
Dec  6 04:10:24 np0005548916 systemd-logind[788]: New session 5 of user zuul.
Dec  6 04:10:24 np0005548916 systemd[1]: Started Session 5 of User zuul.
Dec  6 04:10:24 np0005548916 python3[7529]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cd4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:25 np0005548916 python3[7557]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:25 np0005548916 python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:25 np0005548916 python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:26 np0005548916 python3[7636]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:26 np0005548916 python3[7662]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:27 np0005548916 python3[7740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:10:27 np0005548916 python3[7813]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012227.1862109-517-246331874625000/source _original_basename=tmpujynt52a follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:28 np0005548916 python3[7863]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:10:29 np0005548916 systemd[1]: Reloading.
Dec  6 04:10:29 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:10:31 np0005548916 python3[7919]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  6 04:10:31 np0005548916 python3[7945]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:31 np0005548916 python3[7973]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:31 np0005548916 python3[8001]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:32 np0005548916 python3[8029]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:33 np0005548916 python3[8056]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cdb-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
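The wait_for plus the echo loop above program cgroup-v2 I/O throttling. An io.max line has the form "MAJOR:MINOR riops=N wiops=N rbps=N wbps=N"; 262144000 bytes/s is exactly 250 MiB/s, and 252:0 is presumably the root virtio disk of this Nova guest. The four shell tasks collapse naturally into one loop:

  - name: Cap IOPS and bandwidth on the top-level cgroups (sketch of the loop above)
    ansible.builtin.shell: >
      echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000"
      > /sys/fs/cgroup/{{ item }}/io.max
    loop:
      - init.scope
      - machine.slice
      - system.slice
      - user.slice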
Dec  6 04:10:33 np0005548916 python3[8086]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  6 04:10:36 np0005548916 systemd[1]: session-5.scope: Deactivated successfully.
Dec  6 04:10:36 np0005548916 systemd[1]: session-5.scope: Consumed 4.261s CPU time.
Dec  6 04:10:36 np0005548916 systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Dec  6 04:10:36 np0005548916 systemd-logind[788]: Removed session 5.
Dec  6 04:10:38 np0005548916 systemd-logind[788]: New session 6 of user zuul.
Dec  6 04:10:38 np0005548916 systemd[1]: Started Session 6 of User zuul.
Dec  6 04:10:38 np0005548916 python3[8119]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  6 04:10:53 np0005548916 kernel: SELinux:  Converting 386 SID table entries...
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:10:53 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  Converting 386 SID table entries...
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:03 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  Converting 386 SID table entries...
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:12 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:14 np0005548916 setsebool[8186]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  6 04:11:14 np0005548916 setsebool[8186]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
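These two boolean changes land mid-way through the podman/buildah transaction, which is consistent with the container/openstack SELinux packages flipping them in their install scriptlets. The equivalent persistent form as an explicit task would be:

  - name: Enable the virt booleans changed above
    ansible.posix.seboolean:
      name: "{{ item }}"
      state: true
      persistent: true
    loop:
      - virt_use_nfs
      - virt_sandbox_use_all_caps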
Dec  6 04:11:25 np0005548916 kernel: SELinux:  Converting 389 SID table entries...
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:25 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:43 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  6 04:11:44 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:11:44 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:11:44 np0005548916 systemd[1]: Reloading.
Dec  6 04:11:44 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:11:44 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:11:47 np0005548916 python3[11217]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d561-0a5b-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:11:48 np0005548916 kernel: evm: overlay not supported
Dec  6 04:11:48 np0005548916 systemd[4303]: Starting D-Bus User Message Bus...
Dec  6 04:11:48 np0005548916 dbus-broker-launch[12169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  6 04:11:48 np0005548916 dbus-broker-launch[12169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  6 04:11:48 np0005548916 systemd[4303]: Started D-Bus User Message Bus.
Dec  6 04:11:48 np0005548916 dbus-broker-lau[12169]: Ready
Dec  6 04:11:48 np0005548916 systemd[4303]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  6 04:11:48 np0005548916 systemd[4303]: Created slice Slice /user.
Dec  6 04:11:48 np0005548916 systemd[4303]: podman-12070.scope: unit configures an IP firewall, but not running as root.
Dec  6 04:11:48 np0005548916 systemd[4303]: (This warning is only shown for the first unit using IP firewalling.)
Dec  6 04:11:48 np0005548916 systemd[4303]: Started podman-12070.scope.
Dec  6 04:11:48 np0005548916 systemd[4303]: Started podman-pause-792dbf6a.scope.
Dec  6 04:11:49 np0005548916 systemd[1]: session-6.scope: Deactivated successfully.
Dec  6 04:11:49 np0005548916 systemd[1]: session-6.scope: Consumed 1min 3.444s CPU time.
Dec  6 04:11:49 np0005548916 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Dec  6 04:11:49 np0005548916 systemd-logind[788]: Removed session 6.
Dec  6 04:12:18 np0005548916 systemd-logind[788]: New session 7 of user zuul.
Dec  6 04:12:18 np0005548916 systemd[1]: Started Session 7 of User zuul.
Dec  6 04:12:18 np0005548916 python3[23971]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:19 np0005548916 python3[24155]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:20 np0005548916 python3[24489]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548916.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  6 04:12:20 np0005548916 python3[24712]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:21 np0005548916 python3[25001]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:12:21 np0005548916 python3[25276]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012340.8469179-152-189411988683337/source _original_basename=tmpufrff3wp follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
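The tasks between 04:12:18 and 04:12:21 bootstrap login access: the same Zuul ECDSA key is authorized for zuul, root and a freshly created cloud-admin account, and a sudoers drop-in is installed (content withheld). Condensed into a sketch, with the key and sudoers content as assumed variables:

  - name: Create the cloud-admin user
    ansible.builtin.user:
      name: cloud-admin
      shell: /bin/bash
      state: present

  - name: Authorize the Zuul key for each account
    ansible.posix.authorized_key:
      user: "{{ item }}"
      state: present
      key: "{{ zuul_ecdsa_pubkey }}"  # the ecdsa-sha2-nistp256 key logged above
    loop: [zuul, root, cloud-admin]

  - name: Install the sudoers drop-in
    ansible.builtin.copy:
      dest: /etc/sudoers.d/cloud-admin
      mode: '0640'
      content: "{{ cloud_admin_sudoers }}"  # hypothetical; content was NOT_LOGGING_PARAMETER
      validate: /usr/sbin/visudo -cf %s     # extra safeguard; the logged run used validate=None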
Dec  6 04:12:22 np0005548916 python3[25600]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec  6 04:12:22 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 04:12:22 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 04:12:22 np0005548916 systemd-hostnamed[25700]: Changed pretty hostname to 'compute-1'
Dec  6 04:12:22 np0005548916 systemd-hostnamed[25700]: Hostname set to <compute-1> (static)
Dec  6 04:12:22 np0005548916 NetworkManager[7209]: <info>  [1765012342.6127] hostname: static hostname changed from "np0005548916.novalocal" to "compute-1"
Dec  6 04:12:22 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:12:22 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
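Because the hostname task selects the systemd backend, systemd-hostnamed sets both the pretty and the static name, and NetworkManager observes the change immediately:

  - name: Rename the node via systemd-hostnamed (as seen above)
    ansible.builtin.hostname:
      name: compute-1
      use: systemd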
Dec  6 04:12:23 np0005548916 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Dec  6 04:12:23 np0005548916 systemd[1]: session-7.scope: Deactivated successfully.
Dec  6 04:12:23 np0005548916 systemd[1]: session-7.scope: Consumed 2.503s CPU time.
Dec  6 04:12:23 np0005548916 systemd-logind[788]: Removed session 7.
Dec  6 04:12:24 np0005548916 irqbalance[783]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  6 04:12:24 np0005548916 irqbalance[783]: IRQ 26 affinity is now unmanaged
Dec  6 04:12:32 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:12:34 np0005548916 irqbalance[783]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  6 04:12:34 np0005548916 irqbalance[783]: IRQ 27 affinity is now unmanaged
Dec  6 04:12:34 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:12:34 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:12:34 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 1min 2.454s CPU time.
Dec  6 04:12:34 np0005548916 systemd[1]: run-r3c7901b7490f4b96aebac3fa603e46bc.service: Deactivated successfully.
Dec  6 04:12:52 np0005548916 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:15:48 np0005548916 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  6 04:15:48 np0005548916 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  6 04:15:48 np0005548916 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  6 04:15:48 np0005548916 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  6 04:16:07 np0005548916 systemd-logind[788]: New session 8 of user zuul.
Dec  6 04:16:07 np0005548916 systemd[1]: Started Session 8 of User zuul.
Dec  6 04:16:07 np0005548916 python3[30001]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:16:10 np0005548916 python3[30117]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:10 np0005548916 python3[30190]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:11 np0005548916 python3[30216]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:11 np0005548916 python3[30289]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:11 np0005548916 python3[30315]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:12 np0005548916 python3[30388]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:12 np0005548916 python3[30414]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:12 np0005548916 python3[30487]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:13 np0005548916 python3[30513]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:13 np0005548916 python3[30586]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:13 np0005548916 python3[30612]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:14 np0005548916 python3[30685]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:14 np0005548916 python3[30711]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:15 np0005548916 python3[30784]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
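The stat/copy pairs above install seven repository files from a single staging directory; a loop expresses the same sequence (note the logged mode is 0755, although 0644 is the usual mode for .repo files):

  - name: Install the CI repository files
    ansible.builtin.copy:
      src: "{{ item }}"
      dest: /etc/yum.repos.d/
      mode: '0755'
    loop:
      - delorean.repo
      - delorean-antelope-testing.repo
      - repo-setup-centos-highavailability.repo
      - repo-setup-centos-powertools.repo
      - repo-setup-centos-appstream.repo
      - repo-setup-centos-baseos.repo
      - delorean.repo.md5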
Dec  6 04:16:26 np0005548916 python3[30832]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:21:26 np0005548916 systemd[1]: session-8.scope: Deactivated successfully.
Dec  6 04:21:26 np0005548916 systemd[1]: session-8.scope: Consumed 5.601s CPU time.
Dec  6 04:21:26 np0005548916 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Dec  6 04:21:26 np0005548916 systemd-logind[788]: Removed session 8.
Dec  6 04:23:38 np0005548916 systemd[1]: Starting dnf makecache...
Dec  6 04:23:38 np0005548916 dnf[30838]: Failed determining last makecache time.
Dec  6 04:23:38 np0005548916 dnf[30838]: delorean-openstack-barbican-42b4c41831408a8e323 405 kB/s |  13 kB     00:00
Dec  6 04:23:38 np0005548916 dnf[30838]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 3.3 MB/s |  65 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-python-stevedore-c4acc5639fd2329372142 6.4 MB/s | 131 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.4 MB/s |  32 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  13 MB/s | 349 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.8 MB/s |  42 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-python-designate-tests-tempest-347fdbc 851 kB/s |  18 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-glance-1fd12c29b339f30fe823e 888 kB/s |  18 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.2 MB/s |  29 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-manila-3c01b7181572c95dac462 622 kB/s |  25 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-python-whitebox-neutron-tests-tempest- 6.9 MB/s | 154 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-octavia-ba397f07a7331190208c 1.2 MB/s |  26 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-watcher-c014f81a8647287f6dcc 771 kB/s |  16 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-ansible-config_template-5ccaa22121a7ff 373 kB/s | 7.4 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 7.0 MB/s | 144 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-swift-dc98a8463506ac520c469a 622 kB/s |  14 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-python-tempestconf-8515371b7cceebd4282 2.8 MB/s |  53 kB     00:00
Dec  6 04:23:39 np0005548916 dnf[30838]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.3 MB/s |  96 kB     00:00
Dec  6 04:23:40 np0005548916 dnf[30838]: CentOS Stream 9 - BaseOS                         24 kB/s | 7.3 kB     00:00
Dec  6 04:23:40 np0005548916 dnf[30838]: CentOS Stream 9 - AppStream                      48 kB/s | 7.4 kB     00:00
Dec  6 04:23:40 np0005548916 dnf[30838]: CentOS Stream 9 - CRB                            32 kB/s | 7.2 kB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: CentOS Stream 9 - Extras packages                28 kB/s | 8.3 kB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: dlrn-antelope-testing                            27 MB/s | 1.1 MB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: dlrn-antelope-build-deps                         16 MB/s | 461 kB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: centos9-rabbitmq                                8.1 MB/s | 123 kB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: centos9-storage                                  20 MB/s | 415 kB     00:00
Dec  6 04:23:41 np0005548916 dnf[30838]: centos9-opstools                                3.3 MB/s |  51 kB     00:00
Dec  6 04:23:42 np0005548916 dnf[30838]: NFV SIG OpenvSwitch                              26 MB/s | 456 kB     00:00
Dec  6 04:23:42 np0005548916 dnf[30838]: repo-setup-centos-appstream                      71 MB/s |  25 MB     00:00
Dec  6 04:23:48 np0005548916 dnf[30838]: repo-setup-centos-baseos                         70 MB/s | 8.8 MB     00:00
Dec  6 04:23:50 np0005548916 dnf[30838]: repo-setup-centos-highavailability               36 MB/s | 744 kB     00:00
Dec  6 04:23:50 np0005548916 dnf[30838]: repo-setup-centos-powertools                     76 MB/s | 7.3 MB     00:00
Dec  6 04:24:03 np0005548916 dnf[30838]: Extra Packages for Enterprise Linux 9 - x86_64  1.8 MB/s |  20 MB     00:11
Dec  6 04:24:23 np0005548916 dnf[30838]: Metadata cache created.
Dec  6 04:24:23 np0005548916 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  6 04:24:23 np0005548916 systemd[1]: Finished dnf makecache.
Dec  6 04:24:23 np0005548916 systemd[1]: dnf-makecache.service: Consumed 31.729s CPU time.
Dec  6 04:27:56 np0005548916 systemd-logind[788]: New session 9 of user zuul.
Dec  6 04:27:56 np0005548916 systemd[1]: Started Session 9 of User zuul.
Dec  6 04:27:57 np0005548916 python3.9[31097]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:27:59 np0005548916 python3.9[31278]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
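"#012" is rsyslog's escape for a newline (octal 012), so the _raw_params above is a multi-line script. Decoded and wrapped back into a task it reads:

  - name: Install repo-setup from GitHub and enable the current-podified antelope repos
    ansible.builtin.shell: |
      set -euxo pipefail
      pushd /var/tmp
      curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
      pushd repo-setup-main
      python3 -m venv ./venv
      PBR_VERSION=0.0.0 ./venv/bin/pip install ./
      ./venv/bin/repo-setup current-podified -b antelope
      popd
      rm -rf repo-setup-main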
Dec  6 04:28:06 np0005548916 systemd[1]: session-9.scope: Deactivated successfully.
Dec  6 04:28:06 np0005548916 systemd[1]: session-9.scope: Consumed 8.417s CPU time.
Dec  6 04:28:06 np0005548916 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Dec  6 04:28:06 np0005548916 systemd-logind[788]: Removed session 9.
Dec  6 04:28:22 np0005548916 systemd-logind[788]: New session 10 of user zuul.
Dec  6 04:28:22 np0005548916 systemd[1]: Started Session 10 of User zuul.
Dec  6 04:28:22 np0005548916 python3.9[31488]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  6 04:28:24 np0005548916 python3.9[31662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:25 np0005548916 python3.9[31814]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:28:26 np0005548916 python3.9[31967]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:28:27 np0005548916 python3.9[32119]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:28:27 np0005548916 python3.9[32271]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:28:28 np0005548916 python3.9[32394]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013307.3222613-178-89372333289099/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:28:29 np0005548916 python3.9[32546]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:30 np0005548916 python3.9[32702]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:28:31 np0005548916 python3.9[32854]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:28:32 np0005548916 python3.9[33004]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:28:39 np0005548916 python3.9[33258]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
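/proc/cmdline is read-only, so this lineinfile cannot actually edit anything; with create=False it reads as an assertion that cloud-init=disabled is already on the kernel command line. A plausible sketch (check_mode is an assumption, the log does not record it):

  - name: Assert cloud-init is disabled on the kernel command line
    ansible.builtin.lineinfile:
      path: /proc/cmdline
      line: cloud-init=disabled
      state: present
      create: false
    check_mode: true  # assumed; not visible in the log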
Dec  6 04:28:40 np0005548916 python3.9[33408]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:41 np0005548916 python3.9[33562]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:42 np0005548916 python3.9[33720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:28:43 np0005548916 python3.9[33804]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:29:34 np0005548916 systemd[1]: Reloading.
Dec  6 04:29:34 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:34 np0005548916 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  6 04:29:35 np0005548916 systemd[1]: Reloading.
Dec  6 04:29:35 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:35 np0005548916 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  6 04:29:35 np0005548916 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  6 04:29:35 np0005548916 systemd[1]: Reloading.
Dec  6 04:29:35 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:35 np0005548916 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  6 04:29:35 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:29:35 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:29:35 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:30:43 np0005548916 kernel: SELinux:  Converting 2719 SID table entries...
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:30:43 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:30:43 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  6 04:30:44 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:30:44 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:30:44 np0005548916 systemd[1]: Reloading.
Dec  6 04:30:44 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:30:44 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:30:45 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:30:45 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:30:45 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 1.336s CPU time.
Dec  6 04:30:45 np0005548916 systemd[1]: run-r37da2513ed4f420f90551d7aff076297.service: Deactivated successfully.
Dec  6 04:30:45 np0005548916 python3.9[35332]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:30:48 np0005548916 python3.9[35613]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  6 04:30:49 np0005548916 python3.9[35765]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  6 04:30:54 np0005548916 python3.9[35918]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:30:56 np0005548916 python3.9[36070]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
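The swap sequence: dd writes 1024 one-MiB zero blocks (a 1 GiB /swap file), the file is tightened to 0600, and ansible.posix.mount with state=present only records the fstab entry, effectively "/swap none swap sw 0 0", without activating it. The mkswap/swapon pair follows at 04:31:49 below, where the kernel reports 1048572 KiB, i.e. 1 GiB (1048576 KiB) minus the 4 KiB swap-signature page:

  - name: Register /swap in fstab without activating it yet
    ansible.posix.mount:
      src: /swap
      name: none
      fstype: swap
      opts: sw
      dump: 0
      passno: 0
      state: present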
Dec  6 04:31:04 np0005548916 python3.9[36222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:05 np0005548916 python3.9[36374]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:05 np0005548916 python3.9[36497]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013464.7308097-667-271352270742988/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:31:10 np0005548916 python3.9[36649]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:11 np0005548916 python3.9[36801]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:12 np0005548916 python3.9[36954]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:31:14 np0005548916 python3.9[37106]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  6 04:31:15 np0005548916 python3.9[37259]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:31:15 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:31:16 np0005548916 python3.9[37418]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:31:17 np0005548916 python3.9[37578]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  6 04:31:17 np0005548916 python3.9[37731]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:31:18 np0005548916 python3.9[37889]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  6 04:31:20 np0005548916 python3.9[38041]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:31:23 np0005548916 python3.9[38194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:24 np0005548916 python3.9[38346]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:24 np0005548916 python3.9[38469]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013483.6196544-1024-183795242943665/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:26 np0005548916 python3.9[38621]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:31:26 np0005548916 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:31:26 np0005548916 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  6 04:31:26 np0005548916 kernel: Bridge firewalling registered
Dec  6 04:31:26 np0005548916 systemd-modules-load[38625]: Inserted module 'br_netfilter'
Dec  6 04:31:26 np0005548916 systemd[1]: Finished Load Kernel Modules.
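The content of 99-edpm.conf is not logged, but systemd-modules-load inserting br_netfilter on restart (together with the kernel's note that bridge filtering is no longer available by default) implies the file lists at least that module. A hypothetical reconstruction as a task:

  - name: Load br_netfilter at boot (content hypothetical; only this module is confirmed by the "Inserted module" line above)
    ansible.builtin.copy:
      dest: /etc/modules-load.d/99-edpm.conf
      mode: '0644'
      content: |
        br_netfilter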
Dec  6 04:31:27 np0005548916 python3.9[38780]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:27 np0005548916 python3.9[38903]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013486.4231775-1094-53405985451923/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:29 np0005548916 python3.9[39055]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:31:34 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:31:34 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:31:35 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:31:35 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:31:35 np0005548916 systemd[1]: Reloading.
Dec  6 04:31:35 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:35 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:31:37 np0005548916 python3.9[40396]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:38 np0005548916 python3.9[41231]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  6 04:31:38 np0005548916 python3.9[41968]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:39 np0005548916 python3.9[43042]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:40 np0005548916 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:31:40 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:31:40 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:31:40 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 5.841s CPU time.
Dec  6 04:31:40 np0005548916 systemd[1]: run-r9724e0f38af84b0a8264b851209db721.service: Deactivated successfully.
Dec  6 04:31:40 np0005548916 systemd[1]: Starting Authorization Manager...
Dec  6 04:31:40 np0005548916 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:31:40 np0005548916 polkitd[43440]: Started polkitd version 0.117
Dec  6 04:31:40 np0005548916 systemd[1]: Started Authorization Manager.
Dec  6 04:31:41 np0005548916 python3.9[43610]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:41 np0005548916 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  6 04:31:41 np0005548916 systemd[1]: tuned.service: Deactivated successfully.
Dec  6 04:31:41 np0005548916 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  6 04:31:41 np0005548916 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:31:42 np0005548916 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:31:43 np0005548916 python3.9[43772]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  6 04:31:46 np0005548916 python3.9[43924]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:46 np0005548916 systemd[1]: Reloading.
Dec  6 04:31:47 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:47 np0005548916 python3.9[44114]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:47 np0005548916 systemd[1]: Reloading.
Dec  6 04:31:48 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:49 np0005548916 python3.9[44304]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:49 np0005548916 python3.9[44457]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:49 np0005548916 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  6 04:31:50 np0005548916 python3.9[44610]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:53 np0005548916 python3.9[44772]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
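Per the kernel's KSM interface, writing 0 to /sys/kernel/mm/ksm/run stops ksmd but keeps already-merged pages, 1 starts it, and 2 stops it and unmerges everything; echoing 2 therefore fully unwinds KSM after the ksm/ksmtuned units were disabled above.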
Dec  6 04:31:54 np0005548916 python3.9[44925]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:31:54 np0005548916 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  6 04:31:54 np0005548916 systemd[1]: Stopped Apply Kernel Variables.
Dec  6 04:31:54 np0005548916 systemd[1]: Stopping Apply Kernel Variables...
Dec  6 04:31:54 np0005548916 systemd[1]: Starting Apply Kernel Variables...
Dec  6 04:31:54 np0005548916 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  6 04:31:54 np0005548916 systemd[1]: Finished Apply Kernel Variables.
Dec  6 04:31:54 np0005548916 systemd[1]: session-10.scope: Deactivated successfully.
Dec  6 04:31:54 np0005548916 systemd[1]: session-10.scope: Consumed 2min 29.219s CPU time.
Dec  6 04:31:54 np0005548916 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Dec  6 04:31:54 np0005548916 systemd-logind[788]: Removed session 10.
Dec  6 04:32:00 np0005548916 systemd-logind[788]: New session 11 of user zuul.
Dec  6 04:32:00 np0005548916 systemd[1]: Started Session 11 of User zuul.
Dec  6 04:32:01 np0005548916 python3.9[45108]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:02 np0005548916 python3.9[45264]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  6 04:32:03 np0005548916 python3.9[45417]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:32:05 np0005548916 python3.9[45575]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:32:06 np0005548916 python3.9[45735]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:32:07 np0005548916 python3.9[45819]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:32:14 np0005548916 python3.9[45983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:28 np0005548916 kernel: SELinux:  Converting 2731 SID table entries...
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:32:28 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:32:29 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  6 04:32:29 np0005548916 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  6 04:32:30 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:30 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:30 np0005548916 systemd[1]: Reloading.
Dec  6 04:32:30 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:30 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:31 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:32:31 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:32:31 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:32:31 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 1.018s CPU time.
Dec  6 04:32:31 np0005548916 systemd[1]: run-rc8ed9a019b1e4f9f982ba30e3b926f38.service: Deactivated successfully.
Dec  6 04:32:32 np0005548916 python3.9[47082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:32:32 np0005548916 systemd[1]: Reloading.
Dec  6 04:32:33 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:33 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:33 np0005548916 systemd[1]: Starting Open vSwitch Database Unit...
Dec  6 04:32:33 np0005548916 chown[47124]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  6 04:32:33 np0005548916 ovs-ctl[47129]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  6 04:32:33 np0005548916 ovs-ctl[47129]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  6 04:32:33 np0005548916 ovs-ctl[47129]: Starting ovsdb-server [  OK  ]
Dec  6 04:32:33 np0005548916 ovs-vsctl[47178]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  6 04:32:33 np0005548916 ovs-vsctl[47198]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"61eba479-a995-4b31-88b9-8ebfcea9907e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  6 04:32:33 np0005548916 ovs-ctl[47129]: Configuring Open vSwitch system IDs [  OK  ]
Dec  6 04:32:33 np0005548916 ovs-ctl[47129]: Enabling remote OVSDB managers [  OK  ]
Dec  6 04:32:33 np0005548916 ovs-vsctl[47204]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  6 04:32:33 np0005548916 systemd[1]: Started Open vSwitch Database Unit.
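
ovs-ctl has just created /etc/openvswitch/conf.db and recorded the IDs via the ovs-vsctl calls logged above. A small sketch to read those values back, assuming ovs-vsctl is on PATH (it accepts the hyphenated column names shown in the log):

    import subprocess

    # Read back the Open_vSwitch table values set during database initialization.
    for column in ("db-version", "ovs-version", "external-ids"):
        value = subprocess.run(
            ["ovs-vsctl", "get", "Open_vSwitch", ".", column],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(f"{column}: {value}")
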
Dec  6 04:32:33 np0005548916 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  6 04:32:33 np0005548916 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  6 04:32:33 np0005548916 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  6 04:32:33 np0005548916 kernel: openvswitch: Open vSwitch switching datapath
Dec  6 04:32:33 np0005548916 ovs-ctl[47248]: Inserting openvswitch module [  OK  ]
Dec  6 04:32:34 np0005548916 ovs-ctl[47217]: Starting ovs-vswitchd [  OK  ]
Dec  6 04:32:34 np0005548916 ovs-vsctl[47265]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  6 04:32:34 np0005548916 ovs-ctl[47217]: Enabling remote OVSDB managers [  OK  ]
Dec  6 04:32:34 np0005548916 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  6 04:32:34 np0005548916 systemd[1]: Starting Open vSwitch...
Dec  6 04:32:34 np0005548916 systemd[1]: Finished Open vSwitch.
Dec  6 04:32:36 np0005548916 python3.9[47417]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:37 np0005548916 python3.9[47569]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
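
The sefcontext task above is equivalent to adding a file-context rule with semanage and then relabeling. A sketch under that assumption, with the target and type copied from the invocation (-a fails if the rule already exists, in which case -m would modify it):

    import subprocess

    # Register the context rule, then apply it to files already on disk.
    subprocess.run(
        ["semanage", "fcontext", "-a", "-t", "container_file_t",
         "/var/lib/edpm-config(/.*)?"],
        check=True,
    )
    subprocess.run(["restorecon", "-Rv", "/var/lib/edpm-config"], check=True)
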
Dec  6 04:32:38 np0005548916 kernel: SELinux:  Converting 2745 SID table entries...
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:32:38 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:32:39 np0005548916 python3.9[47724]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:40 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  6 04:32:40 np0005548916 python3.9[47882]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:43 np0005548916 python3.9[48035]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
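
The play installs a package set and then verifies it with rpm -V, which prints one line per file that deviates from the RPM database and exits non-zero on any discrepancy. A sketch of that verification step over a subset of the logged package list:

    import subprocess

    packages = ["driverctl", "lvm2", "nftables", "NetworkManager"]  # subset from the log
    result = subprocess.run(["rpm", "-V", *packages], capture_output=True, text=True)
    if result.returncode != 0:
        # Non-zero also covers "package not installed"; stdout holds the detail lines.
        print("verification differences:\n" + result.stdout)
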
Dec  6 04:32:45 np0005548916 python3.9[48322]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:32:46 np0005548916 python3.9[48472]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:32:47 np0005548916 python3.9[48626]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:49 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:49 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:49 np0005548916 systemd[1]: Reloading.
Dec  6 04:32:49 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:49 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:50 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:32:50 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:32:50 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:32:50 np0005548916 systemd[1]: run-r78981d88ca3449c6b061ca9f5496fc00.service: Deactivated successfully.
Dec  6 04:32:51 np0005548916 python3.9[48943]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:32:51 np0005548916 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  6 04:32:51 np0005548916 systemd[1]: Stopped Network Manager Wait Online.
Dec  6 04:32:51 np0005548916 systemd[1]: Stopping Network Manager Wait Online...
Dec  6 04:32:51 np0005548916 systemd[1]: Stopping Network Manager...
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6832] caught SIGTERM, shutting down normally.
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6858] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6859] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6859] dhcp4 (eth0): state changed no lease
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6861] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:32:51 np0005548916 NetworkManager[7209]: <info>  [1765013571.6969] exiting (success)
Dec  6 04:32:51 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:32:51 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:32:51 np0005548916 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  6 04:32:51 np0005548916 systemd[1]: Stopped Network Manager.
Dec  6 04:32:51 np0005548916 systemd[1]: NetworkManager.service: Consumed 11.988s CPU time, 4.2M memory peak, read 0B from disk, written 27.0K to disk.
Dec  6 04:32:51 np0005548916 systemd[1]: Starting Network Manager...
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.7768] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.7770] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.7822] manager[0x557774aa6090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:32:51 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 04:32:51 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8877] hostname: hostname: using hostnamed
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8878] hostname: static hostname changed from (none) to "compute-1"
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8884] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8890] manager[0x557774aa6090]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8890] manager[0x557774aa6090]: rfkill: WWAN hardware radio set enabled
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8915] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8924] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8925] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8925] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8926] manager: Networking is enabled by state file
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8929] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8933] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.8985] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9002] dhcp: init: Using DHCP client 'internal'
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9006] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9015] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9024] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9035] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9045] device (eth0): carrier: link connected
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9050] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9059] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9060] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9069] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9080] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9088] device (eth1): carrier: link connected
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9092] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9101] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666) (indicated)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9102] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9109] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9117] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec  6 04:32:51 np0005548916 systemd[1]: Started Network Manager.
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9131] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9145] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9148] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9150] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9152] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9155] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9158] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9161] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9166] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9173] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9176] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9185] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9197] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9216] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9218] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9224] device (lo): Activation: successful, device activated.
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9233] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9235] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9238] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9242] device (eth1): Activation: successful, device activated.
Dec  6 04:32:51 np0005548916 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9448] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9457] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9536] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9559] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9561] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9565] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9569] device (eth0): Activation: successful, device activated.
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9574] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:32:51 np0005548916 NetworkManager[48956]: <info>  [1765013571.9577] manager: startup complete
Dec  6 04:32:51 np0005548916 systemd[1]: Finished Network Manager Wait Online.
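
NetworkManager-wait-online blocks until startup is complete, which is why it finishes only after the CONNECTED_GLOBAL transition above. A rough equivalent that polls nmcli instead (status strings can vary, e.g. "connected (site only)", so this is a sketch of the idea, not the unit's actual logic):

    import subprocess
    import time

    def wait_for_nm(timeout: float = 60.0) -> bool:
        # Poll `nmcli -t -f STATE general` until it reports full connectivity.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state = subprocess.run(
                ["nmcli", "-t", "-f", "STATE", "general"],
                capture_output=True, text=True,
            ).stdout.strip()
            if state == "connected":
                return True
            time.sleep(1)
        return False
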
Dec  6 04:32:52 np0005548916 python3.9[49169]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:58 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:58 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:58 np0005548916 systemd[1]: Reloading.
Dec  6 04:32:58 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:58 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:58 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:33:00 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:33:00 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:33:00 np0005548916 systemd[1]: run-rcccf55638ed2426f8a5e5075e64f5b0b.service: Deactivated successfully.
Dec  6 04:33:01 np0005548916 python3.9[49629]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
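
The stat above looks like a re-run guard: a marker file holding the exit code of the previous os-net-config run. A sketch of such a check, assuming the file simply stores that integer (the path is from the log; the file format is an assumption):

    from pathlib import Path

    # Hypothetical guard: skip reconfiguration if the last run already succeeded.
    marker = Path("/var/lib/edpm-config/os-net-config.returncode")
    previous = int(marker.read_text().strip()) if marker.exists() else None
    if previous == 0:
        print("previous os-net-config run succeeded; nothing to do")
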
Dec  6 04:33:02 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:33:02 np0005548916 python3.9[49782]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:03 np0005548916 python3.9[49936]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:04 np0005548916 python3.9[50088]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:04 np0005548916 python3.9[50240]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:05 np0005548916 python3.9[50392]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
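
Taken together, the ini_file tasks above pin no-auto-default=* in the [main] section of NetworkManager.conf and remove dns= and rc-manager= overrides there and in 99-cloud-init.conf. A condensed sketch with configparser (unlike ini_file it drops comments and keeps no backup, so treat it as an approximation):

    import configparser

    path = "/etc/NetworkManager/NetworkManager.conf"
    cfg = configparser.ConfigParser()
    cfg.read(path)
    if not cfg.has_section("main"):
        cfg.add_section("main")
    cfg.set("main", "no-auto-default", "*")   # option=no-auto-default, value=*
    for stale in ("dns", "rc-manager"):       # the state=absent options
        cfg.remove_option("main", stale)
    with open(path, "w") as fh:
        cfg.write(fh)
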
Dec  6 04:33:06 np0005548916 python3.9[50544]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:07 np0005548916 python3.9[50667]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013585.8883102-648-4839093789921/.source _original_basename=.edt9k9cb follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:07 np0005548916 python3.9[50819]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:08 np0005548916 python3.9[50971]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  6 04:33:09 np0005548916 python3.9[51123]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:12 np0005548916 python3.9[51550]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  6 04:33:13 np0005548916 ansible-async_wrapper.py[51725]: Invoked with j948182577153 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.6580362-846-268955869646410/AnsiballZ_edpm_os_net_config.py _
Dec  6 04:33:13 np0005548916 ansible-async_wrapper.py[51728]: Starting module and watcher
Dec  6 04:33:13 np0005548916 ansible-async_wrapper.py[51728]: Start watching 51729 (300)
Dec  6 04:33:13 np0005548916 ansible-async_wrapper.py[51729]: Start module (51729)
Dec  6 04:33:13 np0005548916 ansible-async_wrapper.py[51725]: Return async_wrapper task started.
Dec  6 04:33:13 np0005548916 python3.9[51730]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
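
The module above consumes /etc/os-net-config/config.yaml (slurped a few lines earlier). A sketch that inspects it, assuming the conventional top-level network_config list used by os-net-config (python3-pyyaml was installed by the earlier dnf task):

    import yaml

    with open("/etc/os-net-config/config.yaml") as fh:
        cfg = yaml.safe_load(fh)

    # Each entry describes a device, e.g. type: ovs_bridge, name: br-ex.
    for entry in cfg.get("network_config", []):
        print(entry.get("type"), entry.get("name"))
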
Dec  6 04:33:14 np0005548916 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  6 04:33:14 np0005548916 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  6 04:33:14 np0005548916 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  6 04:33:14 np0005548916 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  6 04:33:14 np0005548916 kernel: cfg80211: failed to load regulatory.db
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.8688] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.8711] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9320] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9321] audit: op="connection-add" uuid="d45d4ee4-6865-4bc9-8f68-2364ae6474e8" name="br-ex-br" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9343] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9344] audit: op="connection-add" uuid="2332240b-855a-4c20-952f-a49148c1f030" name="br-ex-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9366] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9367] audit: op="connection-add" uuid="f72a5f18-8164-4e84-81af-63ac70cda19e" name="eth1-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9385] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9387] audit: op="connection-add" uuid="06560179-8f21-4840-89f1-e305670ae13b" name="vlan20-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9404] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9406] audit: op="connection-add" uuid="e78382ba-8d43-4538-9f10-314df9dad09b" name="vlan21-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9425] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9427] audit: op="connection-add" uuid="cbbe476e-61dc-48d9-a4bc-3925a8944b42" name="vlan22-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9445] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9446] audit: op="connection-add" uuid="7516e499-11e3-42d1-a6da-28b443cf8217" name="vlan23-port" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9474] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9498] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9499] audit: op="connection-add" uuid="2a8ba0ca-93e0-4e84-96a1-eb2bf4feb098" name="br-ex-if" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9826] audit: op="connection-update" uuid="f3fb407f-d9e1-5507-a7f7-856240ad9666" name="ci-private-network" args="connection.controller,connection.master,connection.slave-type,connection.port-type,connection.timestamp,ovs-external-ids.data,ovs-interface.type,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.dns,ipv4.routing-rules,ipv4.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.addresses" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9852] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9856] audit: op="connection-add" uuid="8c1c8581-b0c3-4a63-9c77-9fbc5756bf30" name="vlan20-if" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9879] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9881] audit: op="connection-add" uuid="844cc38d-873d-41cd-b14e-0e20f1031e80" name="vlan21-if" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9905] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9906] audit: op="connection-add" uuid="68b52fa8-2c3f-400a-aa45-d937eefe44a1" name="vlan22-if" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9935] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9937] audit: op="connection-add" uuid="4184a6b0-b7b9-414f-8c5e-d7c69e6b028e" name="vlan23-if" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9956] audit: op="connection-delete" uuid="d0a7d597-e5ec-3c93-9ea9-45506a05a0f2" name="Wired connection 1" pid=51731 uid=0 result="success"
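
At this point os-net-config has added the OVS bridge/port/interface profiles (br-ex-br, the *-port entries, the *-if entries) and deleted the stale 'Wired connection 1'. A sketch listing what was created, filtering nmcli's terse output to the ovs-* connection types:

    import subprocess

    out = subprocess.run(
        ["nmcli", "-t", "-f", "NAME,TYPE,DEVICE", "connection", "show"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        name, ctype, device = line.split(":", 2)
        if ctype.startswith("ovs-"):  # ovs-bridge / ovs-port / ovs-interface
            print(f"{name:<14} {ctype:<16} {device}")
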
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9973] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9985] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9992] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (d45d4ee4-6865-4bc9-8f68-2364ae6474e8)
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9993] audit: op="connection-activate" uuid="d45d4ee4-6865-4bc9-8f68-2364ae6474e8" name="br-ex-br" pid=51731 uid=0 result="success"
Dec  6 04:33:15 np0005548916 NetworkManager[48956]: <info>  [1765013595.9995] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0004] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0010] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (2332240b-855a-4c20-952f-a49148c1f030)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0012] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0019] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0025] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (f72a5f18-8164-4e84-81af-63ac70cda19e)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0026] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0034] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0039] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (06560179-8f21-4840-89f1-e305670ae13b)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0041] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0048] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0053] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (e78382ba-8d43-4538-9f10-314df9dad09b)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0055] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0063] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0068] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (cbbe476e-61dc-48d9-a4bc-3925a8944b42)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0070] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0079] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0085] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (7516e499-11e3-42d1-a6da-28b443cf8217)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0085] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0088] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0090] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0100] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0105] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0110] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (2a8ba0ca-93e0-4e84-96a1-eb2bf4feb098)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0110] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0115] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0117] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0118] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0119] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0134] device (eth1): disconnecting for new activation request.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0134] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0137] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0139] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0140] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0146] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0152] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0157] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8c1c8581-b0c3-4a63-9c77-9fbc5756bf30)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0158] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0161] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0164] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0165] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0169] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0174] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0179] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (844cc38d-873d-41cd-b14e-0e20f1031e80)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0180] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0183] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0184] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0185] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0189] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0232] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0239] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (68b52fa8-2c3f-400a-aa45-d937eefe44a1)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0240] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0244] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0246] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0247] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0250] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0256] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0263] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4184a6b0-b7b9-414f-8c5e-d7c69e6b028e)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0264] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0267] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0270] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0272] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0274] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0297] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=51731 uid=0 result="success"
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0300] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0304] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0305] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0313] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0317] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0321] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0324] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0325] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0330] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0334] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0338] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0339] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0344] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0347] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0349] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0350] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 kernel: ovs-system: entered promiscuous mode
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0358] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 systemd-udevd[51735]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:33:16 np0005548916 kernel: Timeout policy base is empty
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0400] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0409] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0414] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0429] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0439] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0440] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0440] dhcp4 (eth0): state changed no lease
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0444] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0464] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0471] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51731 uid=0 result="fail" reason="Device is not activated"
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0486] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0496] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0508] device (eth1): disconnecting for new activation request.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0509] audit: op="connection-activate" uuid="f3fb407f-d9e1-5507-a7f7-856240ad9666" name="ci-private-network" pid=51731 uid=0 result="success"
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0517] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0525] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0534] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0612] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0734] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0743] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0744] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0747] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0749] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0753] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0757] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0771] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0782] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0795] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0809] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0823] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0833] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0839] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0847] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0854] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0862] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0869] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0877] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0884] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0887] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0892] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0902] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0909] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0916] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 kernel: br-ex: entered promiscuous mode
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0923] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0933] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.0940] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  6 04:33:16 np0005548916 kernel: vlan22: entered promiscuous mode
Dec  6 04:33:16 np0005548916 systemd-udevd[51737]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1077] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1084] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1088] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1096] device (eth1): Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1123] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1139] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1142] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1146] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1187] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548916 kernel: vlan21: entered promiscuous mode
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1236] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1273] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 kernel: vlan20: entered promiscuous mode
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1283] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1292] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1313] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1324] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 kernel: vlan23: entered promiscuous mode
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1532] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1539] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1541] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1553] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1559] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1589] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1594] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1786] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1792] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1794] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1803] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1807] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548916 NetworkManager[48956]: <info>  [1765013596.1811] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
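
The run above walks every OVS device (br-ex, eth1, vlan20 through vlan23) through NetworkManager's activation pipeline: prepare -> config -> ip-config -> ip-check -> secondaries -> activated. A minimal sketch of waiting for that to converge, assuming nmcli is on PATH and using the device names from this log; the 60 s timeout is an assumption:

    import subprocess
    import sys
    import time

    # Devices taken from the activation run logged above.
    DEVICES = ["br-ex", "eth1", "vlan20", "vlan21", "vlan22", "vlan23"]

    def device_state(dev: str) -> str:
        # "nmcli -g GENERAL.STATE device show DEV" prints e.g. "100 (connected)".
        result = subprocess.run(
            ["nmcli", "-g", "GENERAL.STATE", "device", "show", dev],
            capture_output=True, text=True, check=False)
        return result.stdout.strip()

    def wait_activated(timeout: float = 60.0) -> bool:
        deadline = time.monotonic() + timeout
        pending = set(DEVICES)
        while pending and time.monotonic() < deadline:
            pending = {d for d in pending if "connected" not in device_state(d)}
            time.sleep(1)
        return not pending

    if __name__ == "__main__":
        sys.exit(0 if wait_activated() else 1)
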
Dec  6 04:33:17 np0005548916 NetworkManager[48956]: <info>  [1765013597.3406] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec  6 04:33:17 np0005548916 python3.9[52088]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=status _async_dir=/root/.ansible_async
Dec  6 04:33:17 np0005548916 NetworkManager[48956]: <info>  [1765013597.5249] checkpoint[0x557774a7c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  6 04:33:17 np0005548916 NetworkManager[48956]: <info>  [1765013597.5251] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec  6 04:33:17 np0005548916 NetworkManager[48956]: <info>  [1765013597.8427] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec  6 04:33:17 np0005548916 NetworkManager[48956]: <info>  [1765013597.8445] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.1084] audit: op="networking-control" arg="global-dns-configuration" pid=51731 uid=0 result="success"
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.1122] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.1159] audit: op="networking-control" arg="global-dns-configuration" pid=51731 uid=0 result="success"
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.1200] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.2718] checkpoint[0x557774a7ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  6 04:33:18 np0005548916 NetworkManager[48956]: <info>  [1765013598.2723] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
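
The audit entries above show the checkpoint protection the Ansible-driven network configuration (pid 51731) wraps around its changes: create a checkpoint, extend its rollback timeout while work proceeds, and destroy it on success so the changes stick; if the client died, NetworkManager would roll back instead. A sketch of the same calls over D-Bus, assuming busctl is available and relying on the documented CheckpointCreate/CheckpointDestroy methods; the timeout and flags values here are illustrative, not taken from the log:

    import subprocess

    NM = "org.freedesktop.NetworkManager"

    def checkpoint_create(rollback_timeout: int = 120) -> str:
        # CheckpointCreate(ao devices, u rollback_timeout, u flags) -> o
        # An empty device array ("0") checkpoints all devices; flags=0 is an
        # assumption, see the NM D-Bus API for the available flag bits.
        out = subprocess.run(
            ["busctl", "call", NM, "/org/freedesktop/NetworkManager", NM,
             "CheckpointCreate", "aouu", "0", str(rollback_timeout), "0"],
            capture_output=True, text=True, check=True).stdout
        # busctl prints: o "/org/freedesktop/NetworkManager/Checkpoint/1"
        return out.split()[-1].strip('"')

    def checkpoint_destroy(checkpoint: str) -> None:
        # Destroying the checkpoint commits the changes; letting the rollback
        # timeout expire instead would revert the configuration.
        subprocess.run(
            ["busctl", "call", NM, "/org/freedesktop/NetworkManager", NM,
             "CheckpointDestroy", "o", checkpoint],
            check=True)
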
Dec  6 04:33:18 np0005548916 ansible-async_wrapper.py[51729]: Module complete (51729)
Dec  6 04:33:18 np0005548916 ansible-async_wrapper.py[51728]: Done in kid B.
Dec  6 04:33:21 np0005548916 python3.9[52194]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=status _async_dir=/root/.ansible_async
Dec  6 04:33:21 np0005548916 python3.9[52294]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=cleanup _async_dir=/root/.ansible_async
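
The async_status invocations above poll the fire-and-forget job j948182577153.51725 until it reports finished, then clean up its state with mode=cleanup. A sketch of the controller-side check, assuming the wrapper keeps its status as JSON under _async_dir named by the jid, which matches the paths logged here; the "finished" flag is what async_status reports back:

    import json
    import pathlib

    def async_finished(jid: str, async_dir: str = "/root/.ansible_async") -> bool:
        # Treat a parseable JSON payload with finished == 1 as completion
        # (file layout assumed from the module invocations in this log).
        status = pathlib.Path(async_dir) / jid
        if not status.is_file():
            return False
        try:
            return json.loads(status.read_text()).get("finished") == 1
        except (ValueError, OSError):
            return False  # the wrapper may still be writing the file
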
Dec  6 04:33:21 np0005548916 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:33:22 np0005548916 python3.9[52448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:23 np0005548916 python3.9[52571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013602.005343-927-212777252682982/.source.returncode _original_basename=.aitz6_rl follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:24 np0005548916 python3.9[52723]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:24 np0005548916 python3.9[52847]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013603.6428676-975-269970283608371/.source.cfg _original_basename=.v7quqsaw follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:25 np0005548916 python3.9[52999]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:33:25 np0005548916 systemd[1]: Reloading Network Manager...
Dec  6 04:33:25 np0005548916 NetworkManager[48956]: <info>  [1765013605.9513] audit: op="reload" arg="0" pid=53003 uid=0 result="success"
Dec  6 04:33:25 np0005548916 NetworkManager[48956]: <info>  [1765013605.9526] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  6 04:33:26 np0005548916 systemd[1]: Reloaded Network Manager.
Dec  6 04:33:26 np0005548916 systemd[1]: session-11.scope: Deactivated successfully.
Dec  6 04:33:26 np0005548916 systemd[1]: session-11.scope: Consumed 59.382s CPU time.
Dec  6 04:33:26 np0005548916 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Dec  6 04:33:26 np0005548916 systemd-logind[788]: Removed session 11.
Dec  6 04:33:32 np0005548916 systemd-logind[788]: New session 12 of user zuul.
Dec  6 04:33:32 np0005548916 systemd[1]: Started Session 12 of User zuul.
Dec  6 04:33:33 np0005548916 python3.9[53187]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:34 np0005548916 python3.9[53342]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:36 np0005548916 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:33:37 np0005548916 python3.9[53537]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:33:37 np0005548916 systemd[1]: session-12.scope: Deactivated successfully.
Dec  6 04:33:37 np0005548916 systemd[1]: session-12.scope: Consumed 2.570s CPU time.
Dec  6 04:33:37 np0005548916 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Dec  6 04:33:37 np0005548916 systemd-logind[788]: Removed session 12.
Dec  6 04:33:43 np0005548916 systemd-logind[788]: New session 13 of user zuul.
Dec  6 04:33:43 np0005548916 systemd[1]: Started Session 13 of User zuul.
Dec  6 04:33:44 np0005548916 python3.9[53719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:46 np0005548916 python3.9[53873]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:47 np0005548916 python3.9[54030]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:48 np0005548916 python3.9[54114]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:33:50 np0005548916 python3.9[54267]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:52 np0005548916 python3.9[54462]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:53 np0005548916 python3.9[54614]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:33:53 np0005548916 systemd[1]: var-lib-containers-storage-overlay-compat2171939770-merged.mount: Deactivated successfully.
Dec  6 04:33:53 np0005548916 podman[54615]: 2025-12-06 09:33:53.190277393 +0000 UTC m=+0.080201695 system refresh
Dec  6 04:33:54 np0005548916 python3.9[54778]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:54 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:33:55 np0005548916 python3.9[54901]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013633.5129702-198-186887960172822/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9e9c5ca233623e32c18f7aced1026064b2947e96 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
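
After inspecting the default podman network, the play templates /etc/containers/networks/podman.json. A sketch of writing a netavark-style bridge definition with the same ownership and mode as the copy task above; the field set and the 10.88.0.0/16 subnet are assumptions modeled on podman's default network, not values visible in this log (podman normally also records a generated "id"):

    import json
    import pathlib

    # Assumed shape of a netavark bridge network; illustrative only.
    podman_net = {
        "name": "podman",
        "driver": "bridge",
        "network_interface": "podman0",
        "subnets": [{"subnet": "10.88.0.0/16", "gateway": "10.88.0.1"}],
        "ipv6_enabled": False,
        "internal": False,
        "dns_enabled": False,
    }

    dest = pathlib.Path("/etc/containers/networks/podman.json")
    dest.write_text(json.dumps(podman_net, indent=2) + "\n")
    dest.chmod(0o644)  # matches mode=0644 from the copy task above
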
Dec  6 04:33:55 np0005548916 python3.9[55053]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:56 np0005548916 python3.9[55176]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013635.2833784-243-215436565585376/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:57 np0005548916 python3.9[55328]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:58 np0005548916 python3.9[55480]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:58 np0005548916 python3.9[55632]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:59 np0005548916 python3.9[55784]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
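
The four ini_file tasks above pin pids_limit, events_logger, runtime and network_backend in /etc/containers/containers.conf. A sketch producing an equivalent file; containers.conf is TOML, which is why the string values carry their own double quotes, exactly as the logged value= parameters do:

    import configparser

    cfg = configparser.ConfigParser()
    cfg["containers"] = {"pids_limit": "4096"}
    cfg["engine"] = {"events_logger": '"journald"', "runtime": '"crun"'}
    cfg["network"] = {"network_backend": '"netavark"'}

    # configparser emits 'key = value' under [section] headers, which is
    # accepted for these simple scalar TOML options.
    with open("/etc/containers/containers.conf", "w") as fh:
        cfg.write(fh)
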
Dec  6 04:34:00 np0005548916 python3.9[55936]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:34:03 np0005548916 python3.9[56089]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:34:04 np0005548916 python3.9[56243]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:34:05 np0005548916 python3.9[56395]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:34:06 np0005548916 python3.9[56547]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:34:07 np0005548916 python3.9[56700]: ansible-service_facts Invoked
Dec  6 04:34:07 np0005548916 network[56717]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:34:07 np0005548916 network[56718]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:34:07 np0005548916 network[56719]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:34:14 np0005548916 python3.9[57171]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:34:18 np0005548916 python3.9[57324]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  6 04:34:19 np0005548916 python3.9[57476]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:20 np0005548916 python3.9[57601]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013659.1046736-676-27212550812926/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:21 np0005548916 python3.9[57755]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:22 np0005548916 python3.9[57880]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013660.749061-721-146435706367135/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:23 np0005548916 python3.9[58034]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:25 np0005548916 python3.9[58188]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:34:26 np0005548916 python3.9[58272]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:34:28 np0005548916 python3.9[58426]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:34:29 np0005548916 python3.9[58510]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:34:29 np0005548916 chronyd[796]: chronyd exiting
Dec  6 04:34:29 np0005548916 systemd[1]: Stopping NTP client/server...
Dec  6 04:34:29 np0005548916 systemd[1]: chronyd.service: Deactivated successfully.
Dec  6 04:34:29 np0005548916 systemd[1]: Stopped NTP client/server.
Dec  6 04:34:29 np0005548916 systemd[1]: Starting NTP client/server...
Dec  6 04:34:29 np0005548916 chronyd[58518]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  6 04:34:29 np0005548916 chronyd[58518]: Frequency -23.235 +/- 0.721 ppm read from /var/lib/chrony/drift
Dec  6 04:34:29 np0005548916 chronyd[58518]: Loaded seccomp filter (level 2)
Dec  6 04:34:29 np0005548916 systemd[1]: Started NTP client/server.
Dec  6 04:34:30 np0005548916 systemd[1]: session-13.scope: Deactivated successfully.
Dec  6 04:34:30 np0005548916 systemd[1]: session-13.scope: Consumed 28.919s CPU time.
Dec  6 04:34:30 np0005548916 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Dec  6 04:34:30 np0005548916 systemd-logind[788]: Removed session 13.
Dec  6 04:34:36 np0005548916 systemd-logind[788]: New session 14 of user zuul.
Dec  6 04:34:36 np0005548916 systemd[1]: Started Session 14 of User zuul.
Dec  6 04:34:37 np0005548916 python3.9[58699]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:37 np0005548916 python3.9[58851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:38 np0005548916 python3.9[58974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013677.2775483-63-73203465115698/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:39 np0005548916 systemd[1]: session-14.scope: Deactivated successfully.
Dec  6 04:34:39 np0005548916 systemd[1]: session-14.scope: Consumed 1.855s CPU time.
Dec  6 04:34:39 np0005548916 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Dec  6 04:34:39 np0005548916 systemd-logind[788]: Removed session 14.
Dec  6 04:34:45 np0005548916 systemd-logind[788]: New session 15 of user zuul.
Dec  6 04:34:45 np0005548916 systemd[1]: Started Session 15 of User zuul.
Dec  6 04:34:46 np0005548916 python3.9[59152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:34:48 np0005548916 python3.9[59308]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:49 np0005548916 python3.9[59483]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:49 np0005548916 python3.9[59606]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765013688.536725-84-53368865078967/.source.json _original_basename=.qnh66egr follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:51 np0005548916 python3.9[59758]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:51 np0005548916 python3.9[59881]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013690.6593707-153-263416561976080/.source _original_basename=.79vzxolj follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:52 np0005548916 python3.9[60033]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:53 np0005548916 python3.9[60185]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:54 np0005548916 python3.9[60308]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013692.8972878-225-279198583258885/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:54 np0005548916 python3.9[60460]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:55 np0005548916 python3.9[60583]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013694.2990668-225-163750403702538/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:56 np0005548916 python3.9[60735]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:57 np0005548916 python3.9[60887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:58 np0005548916 python3.9[61010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013696.9059095-336-90911090076190/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:59 np0005548916 python3.9[61162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:59 np0005548916 python3.9[61285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013698.5931203-381-208996469997144/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:01 np0005548916 python3.9[61437]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:01 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:01 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:01 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:01 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:01 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:01 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:01 np0005548916 systemd[1]: Starting EDPM Container Shutdown...
Dec  6 04:35:01 np0005548916 systemd[1]: Finished EDPM Container Shutdown.
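
Installing the unit is paired with a systemd preset and then a daemon_reload + enable + start (the two Reloading passes above come from enabling and then starting the unit). A sketch of the same sequence; the one-line preset body follows the documented systemd.preset syntax and is an assumption, since the log does not show the file's contents:

    import pathlib
    import subprocess

    preset = pathlib.Path(
        "/etc/systemd/system-preset/91-edpm-container-shutdown.preset")
    preset.write_text("enable edpm-container-shutdown.service\n")  # assumed body

    for cmd in (["systemctl", "daemon-reload"],
                ["systemctl", "enable", "edpm-container-shutdown.service"],
                ["systemctl", "start", "edpm-container-shutdown.service"]):
        subprocess.run(cmd, check=True)
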
Dec  6 04:35:02 np0005548916 python3.9[61665]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:03 np0005548916 python3.9[61788]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013702.0467942-450-212471631730688/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:04 np0005548916 python3.9[61940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:04 np0005548916 python3.9[62063]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013703.4939406-495-6268607295447/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:05 np0005548916 python3.9[62215]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:05 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:05 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:05 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:06 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:06 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:06 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:06 np0005548916 systemd[1]: Starting Create netns directory...
Dec  6 04:35:06 np0005548916 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:35:06 np0005548916 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:35:06 np0005548916 systemd[1]: Finished Create netns directory.
Dec  6 04:35:08 np0005548916 python3.9[62441]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:35:09 np0005548916 network[62458]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:35:09 np0005548916 network[62459]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:35:09 np0005548916 network[62460]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:35:13 np0005548916 python3.9[62722]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:13 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:13 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:13 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:13 np0005548916 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  6 04:35:14 np0005548916 iptables.init[62762]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  6 04:35:14 np0005548916 iptables.init[62762]: iptables: Flushing firewall rules: [  OK  ]
Dec  6 04:35:14 np0005548916 systemd[1]: iptables.service: Deactivated successfully.
Dec  6 04:35:14 np0005548916 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  6 04:35:14 np0005548916 python3.9[62959]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:17 np0005548916 python3.9[63113]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:17 np0005548916 systemd[1]: Reloading.
Dec  6 04:35:17 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:17 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:17 np0005548916 systemd[1]: Starting Netfilter Tables...
Dec  6 04:35:17 np0005548916 systemd[1]: Finished Netfilter Tables.
Dec  6 04:35:18 np0005548916 python3.9[63306]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
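
This is the cut-over from the legacy iptables services to nftables: stop and disable iptables and ip6tables (their stop path resets chains to ACCEPT and flushes, as logged), enable nftables, then clear any leftover ruleset before the managed one is loaded. The same steps as a sketch:

    import subprocess

    for unit in ("iptables.service", "ip6tables.service"):
        # check=False: the unit may be absent or already stopped on some hosts.
        subprocess.run(["systemctl", "disable", "--now", unit], check=False)

    subprocess.run(["systemctl", "enable", "--now", "nftables"], check=True)
    subprocess.run(["nft", "flush", "ruleset"], check=True)  # clean slate
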
Dec  6 04:35:19 np0005548916 python3.9[63459]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:20 np0005548916 python3.9[63584]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013719.4834769-702-267801939867960/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:21 np0005548916 python3.9[63737]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:35:21 np0005548916 systemd[1]: Reloading OpenSSH server daemon...
Dec  6 04:35:21 np0005548916 systemd[1]: Reloaded OpenSSH server daemon.
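
The sshd_config copy above runs with validate=/usr/sbin/sshd -T -f %s, so a config that sshd cannot parse never reaches /etc/ssh, and the follow-up task reloads the daemon. A minimal sketch of that write-validate-swap-reload pattern:

    import os
    import subprocess
    import tempfile

    def install_sshd_config(new_text: str,
                            dest: str = "/etc/ssh/sshd_config") -> None:
        # Write the candidate next to the target so the final rename is
        # atomic on one filesystem.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest))
        with os.fdopen(fd, "w") as fh:
            fh.write(new_text)
        os.chmod(tmp, 0o600)
        # sshd parses the temp file in test mode; a bad config raises here
        # and the live file is left untouched (the temp file is leaked,
        # acceptable for a sketch).
        subprocess.run(["/usr/sbin/sshd", "-T", "-f", tmp], check=True)
        os.replace(tmp, dest)
        subprocess.run(["systemctl", "reload", "sshd"], check=True)
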
Dec  6 04:35:23 np0005548916 python3.9[63893]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:24 np0005548916 python3.9[64045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:25 np0005548916 python3.9[64168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013723.897249-795-20485533924408/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:27 np0005548916 python3.9[64320]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:35:27 np0005548916 systemd[1]: Starting Time & Date Service...
Dec  6 04:35:27 np0005548916 systemd[1]: Started Time & Date Service.
Dec  6 04:35:28 np0005548916 python3.9[64476]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:29 np0005548916 python3.9[64628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:29 np0005548916 python3.9[64751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013728.6404147-900-79958882178775/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:30 np0005548916 python3.9[64904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:31 np0005548916 python3.9[65027]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013730.1672492-945-261750151401702/.source.yaml _original_basename=.zarpue_k follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:32 np0005548916 python3.9[65179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:32 np0005548916 python3.9[65302]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013731.7201493-990-277496101914856/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:33 np0005548916 python3.9[65454]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:34 np0005548916 python3.9[65607]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:35 np0005548916 python3[65760]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:35:36 np0005548916 python3.9[65912]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:36 np0005548916 python3.9[66035]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013735.7590027-1107-109234224364961/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:37 np0005548916 python3.9[66187]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:38 np0005548916 python3.9[66310]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013737.3152022-1152-230299439609934/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:39 np0005548916 python3.9[66462]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:40 np0005548916 python3.9[66585]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013738.888836-1197-45251254807162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:41 np0005548916 python3.9[66737]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:41 np0005548916 python3.9[66860]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013740.4567454-1242-262655640109519/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:42 np0005548916 python3.9[67012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:43 np0005548916 python3.9[67135]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013741.9514697-1288-56654241395874/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:43 np0005548916 python3.9[67287]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:44 np0005548916 python3.9[67439]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:46 np0005548916 python3.9[67598]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
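
Before the include block is persisted to /etc/sysconfig/nftables.conf, all five fragments are syntax-checked as one ruleset, in the same order as the pipeline logged above. A sketch of that dry run (nft -c parses without applying; "-f -" reads stdin):

    import pathlib
    import subprocess

    FRAGMENTS = ["edpm-chains.nft", "edpm-flushes.nft", "edpm-rules.nft",
                 "edpm-update-jumps.nft", "edpm-jumps.nft"]

    # Concatenate in the logged order and let nft validate the whole set.
    ruleset = "".join((pathlib.Path("/etc/nftables") / f).read_text()
                      for f in FRAGMENTS)
    subprocess.run(["nft", "-c", "-f", "-"], input=ruleset, text=True,
                   check=True)
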
Dec  6 04:35:47 np0005548916 python3.9[67751]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:48 np0005548916 python3.9[67905]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:49 np0005548916 python3.9[68057]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:35:50 np0005548916 python3.9[68210]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
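
The two mount tasks give hugepage consumers dedicated 1G and 2M mount points and, with state=mounted, persist them in /etc/fstab. Equivalent one-off mounts, with the fstab lines the module would write shown as comments:

    import subprocess

    # Persisted form in /etc/fstab:
    #   none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
    #   none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
    for path, size in (("/dev/hugepages1G", "1G"), ("/dev/hugepages2M", "2M")):
        subprocess.run(["mount", "-t", "hugetlbfs", "-o", f"pagesize={size}",
                        "none", path], check=True)
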
Dec  6 04:35:51 np0005548916 systemd[1]: session-15.scope: Deactivated successfully.
Dec  6 04:35:51 np0005548916 systemd[1]: session-15.scope: Consumed 38.870s CPU time.
Dec  6 04:35:51 np0005548916 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Dec  6 04:35:51 np0005548916 systemd-logind[788]: Removed session 15.
Dec  6 04:35:56 np0005548916 systemd-logind[788]: New session 16 of user zuul.
Dec  6 04:35:56 np0005548916 systemd[1]: Started Session 16 of User zuul.
Dec  6 04:35:57 np0005548916 python3.9[68391]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  6 04:35:57 np0005548916 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:35:58 np0005548916 python3.9[68545]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:35:59 np0005548916 python3.9[68697]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:00 np0005548916 python3.9[68849]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=
    compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476
    compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=
    compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=
    compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8
    compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=
    compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=
    compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U
    compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=
    create=True mode=0644 path=/tmp/ansible.5_6wmhzq state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:01 np0005548916 python3.9[69001]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.5_6wmhzq' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:02 np0005548916 python3.9[69155]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.5_6wmhzq state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
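
[annotation] The four entries above implement the EDPM known-hosts distribution: fact gathering collects each node's SSH host public keys, blockinfile assembles them into a temp file, a shell task installs that file as /etc/ssh/ssh_known_hosts, and the temp file is deleted. A minimal shell sketch of the same flow; hostnames and IPs are taken from the log, the key material is truncated here, and the temp filename is random per run:

    # assemble one line per host/key type, then install system-wide
    tmp=$(mktemp /tmp/ansible.XXXXXX)          # the run above used /tmp/ansible.5_6wmhzq
    {
      echo 'compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAA...'
      echo 'compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAA...'
      echo 'compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAA...'
    } > "$tmp"
    cat "$tmp" > /etc/ssh/ssh_known_hosts      # same copy command the log records
    rm -f "$tmp"
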
Dec  6 04:36:02 np0005548916 systemd[1]: session-16.scope: Deactivated successfully.
Dec  6 04:36:02 np0005548916 systemd[1]: session-16.scope: Consumed 3.677s CPU time.
Dec  6 04:36:02 np0005548916 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Dec  6 04:36:02 np0005548916 systemd-logind[788]: Removed session 16.
Dec  6 04:36:08 np0005548916 systemd-logind[788]: New session 17 of user zuul.
Dec  6 04:36:08 np0005548916 systemd[1]: Started Session 17 of User zuul.
Dec  6 04:36:09 np0005548916 python3.9[69333]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:10 np0005548916 python3.9[69489]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:36:11 np0005548916 python3.9[69643]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
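
[annotation] The two ansible.builtin.systemd calls above are the module equivalent of:

    systemctl enable sshd    # enabled=True
    systemctl start sshd     # state=started
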
Dec  6 04:36:12 np0005548916 python3.9[69796]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:13 np0005548916 python3.9[69949]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:14 np0005548916 python3.9[70103]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:15 np0005548916 python3.9[70258]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
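
[annotation] These four entries are the edpm_nftables apply sequence: chain definitions are loaded first, an edpm-rules.nft.changed marker file gates the expensive step, and the flushes, rules, and jump updates are concatenated into a single nft transaction so the ruleset swap is atomic. Treating the marker check as an if-condition is an assumption about the role's intent; the log only shows the stat call followed by the commands. A sketch:

    nft -f /etc/nftables/edpm-chains.nft                     # (re)create chains
    if [ -e /etc/nftables/edpm-rules.nft.changed ]; then
        set -o pipefail
        cat /etc/nftables/edpm-flushes.nft \
            /etc/nftables/edpm-rules.nft \
            /etc/nftables/edpm-update-jumps.nft | nft -f -   # one atomic load
        rm -f /etc/nftables/edpm-rules.nft.changed           # clear the marker
    fi
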
Dec  6 04:36:16 np0005548916 systemd[1]: session-17.scope: Deactivated successfully.
Dec  6 04:36:16 np0005548916 systemd[1]: session-17.scope: Consumed 4.888s CPU time.
Dec  6 04:36:16 np0005548916 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Dec  6 04:36:16 np0005548916 systemd-logind[788]: Removed session 17.
Dec  6 04:36:21 np0005548916 systemd-logind[788]: New session 18 of user zuul.
Dec  6 04:36:21 np0005548916 systemd[1]: Started Session 18 of User zuul.
Dec  6 04:36:22 np0005548916 python3.9[70436]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:23 np0005548916 python3.9[70592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:36:24 np0005548916 python3.9[70676]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:36:27 np0005548916 python3.9[70827]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:28 np0005548916 python3.9[70978]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
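
[annotation] yum-utils is installed in the previous entry so the play can run needs-restarting; the find call then scans /var/lib/openstack/reboot_required/, presumably for per-service reboot marker files. An equivalent manual check, assuming the conventional exit codes of needs-restarting -r (0 = no reboot needed, 1 = reboot needed):

    dnf -y install yum-utils
    needs-restarting -r && echo "no reboot needed" || echo "reboot required"
    ls -A /var/lib/openstack/reboot_required/   # marker files here would also request a reboot
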
Dec  6 04:36:29 np0005548916 python3.9[71128]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:29 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:36:30 np0005548916 python3.9[71279]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:30 np0005548916 systemd[1]: session-18.scope: Deactivated successfully.
Dec  6 04:36:30 np0005548916 systemd[1]: session-18.scope: Consumed 6.677s CPU time.
Dec  6 04:36:30 np0005548916 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Dec  6 04:36:30 np0005548916 systemd-logind[788]: Removed session 18.
Dec  6 04:36:38 np0005548916 chronyd[58518]: Selected source 23.133.168.246 (pool.ntp.org)
Dec  6 04:36:41 np0005548916 systemd-logind[788]: New session 19 of user zuul.
Dec  6 04:36:41 np0005548916 systemd[1]: Started Session 19 of User zuul.
Dec  6 04:36:49 np0005548916 python3[72045]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:51 np0005548916 python3[72140]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  6 04:36:53 np0005548916 python3[72167]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  6 04:36:53 np0005548916 python3[72193]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:53 np0005548916 kernel: loop: module loaded
Dec  6 04:36:53 np0005548916 kernel: loop3: detected capacity change from 0 to 41943040
Dec  6 04:36:53 np0005548916 python3[72228]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:53 np0005548916 lvm[72231]: PV /dev/loop3 not used.
Dec  6 04:36:54 np0005548916 lvm[72233]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:36:54 np0005548916 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec  6 04:36:54 np0005548916 lvm[72243]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:36:54 np0005548916 lvm[72243]: VG ceph_vg0 finished
Dec  6 04:36:54 np0005548916 lvm[72241]:  1 logical volume(s) in volume group "ceph_vg0" now active
Dec  6 04:36:54 np0005548916 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
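
[annotation] rsyslog escapes embedded newlines as #012, so the two ansible.legacy.command entries above are actually short scripts. Decoded, the loop-backed Ceph OSD device is prepared like this:

    # sparse 20 GiB backing file, attached as /dev/loop3
    dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
    losetup /dev/loop3 /var/lib/ceph-osd-0.img
    lsblk
    # the whole device becomes a single LV for ceph-volume to consume
    pvcreate /dev/loop3
    vgcreate ceph_vg0 /dev/loop3
    lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
    lvs
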
Dec  6 04:36:54 np0005548916 python3[72321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:36:54 np0005548916 python3[72394]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013814.3516102-36828-265389727803626/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:55 np0005548916 python3[72444]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:36:55 np0005548916 systemd[1]: Reloading.
Dec  6 04:36:55 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:36:55 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:36:56 np0005548916 systemd[1]: Starting Ceph OSD losetup...
Dec  6 04:36:56 np0005548916 bash[72485]: /dev/loop3: [64513]:4327945 (/var/lib/ceph-osd-0.img)
Dec  6 04:36:56 np0005548916 systemd[1]: Finished Ceph OSD losetup.
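
[annotation] The unit started here was templated from ceph-osd-losetup.service.j2 and its body is not logged; the losetup -j style output from bash[72485] suggests a oneshot wrapper that re-attaches the backing file at boot. A hypothetical reconstruction, written as the shell that would install it; everything except the paths and the unit name is an assumption:

    # hypothetical unit body: not present in the log, inferred from its observed behavior
    cat > /etc/systemd/system/ceph-osd-losetup-0.service <<'EOF'
    [Unit]
    Description=Ceph OSD losetup
    After=local-fs.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    ExecStart=/usr/bin/bash -c 'losetup -j /var/lib/ceph-osd-0.img | grep -q . || losetup /dev/loop3 /var/lib/ceph-osd-0.img; losetup -j /var/lib/ceph-osd-0.img'

    [Install]
    WantedBy=multi-user.target
    EOF
    systemctl daemon-reload && systemctl enable --now ceph-osd-losetup-0.service
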
Dec  6 04:36:56 np0005548916 lvm[72486]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:36:56 np0005548916 lvm[72486]: VG ceph_vg0 finished
Dec  6 04:36:58 np0005548916 python3[72510]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:38:32 np0005548916 systemd[1]: Created slice User Slice of UID 42477.
Dec  6 04:38:32 np0005548916 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  6 04:38:32 np0005548916 systemd-logind[788]: New session 20 of user ceph-admin.
Dec  6 04:38:32 np0005548916 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  6 04:38:32 np0005548916 systemd[1]: Starting User Manager for UID 42477...
Dec  6 04:38:32 np0005548916 systemd[72560]: Queued start job for default target Main User Target.
Dec  6 04:38:32 np0005548916 systemd[72560]: Created slice User Application Slice.
Dec  6 04:38:32 np0005548916 systemd[72560]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:38:32 np0005548916 systemd[72560]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:38:32 np0005548916 systemd[72560]: Reached target Paths.
Dec  6 04:38:32 np0005548916 systemd[72560]: Reached target Timers.
Dec  6 04:38:32 np0005548916 systemd[72560]: Starting D-Bus User Message Bus Socket...
Dec  6 04:38:32 np0005548916 systemd[72560]: Starting Create User's Volatile Files and Directories...
Dec  6 04:38:32 np0005548916 systemd[72560]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:38:32 np0005548916 systemd[72560]: Finished Create User's Volatile Files and Directories.
Dec  6 04:38:32 np0005548916 systemd[72560]: Reached target Sockets.
Dec  6 04:38:32 np0005548916 systemd[72560]: Reached target Basic System.
Dec  6 04:38:32 np0005548916 systemd[72560]: Reached target Main User Target.
Dec  6 04:38:32 np0005548916 systemd[72560]: Startup finished in 159ms.
Dec  6 04:38:32 np0005548916 systemd[1]: Started User Manager for UID 42477.
Dec  6 04:38:32 np0005548916 systemd[1]: Started Session 20 of User ceph-admin.
Dec  6 04:38:32 np0005548916 systemd-logind[788]: New session 22 of user ceph-admin.
Dec  6 04:38:33 np0005548916 systemd[1]: Started Session 22 of User ceph-admin.
Dec  6 04:38:33 np0005548916 systemd-logind[788]: New session 23 of user ceph-admin.
Dec  6 04:38:33 np0005548916 systemd[1]: Started Session 23 of User ceph-admin.
Dec  6 04:38:33 np0005548916 systemd-logind[788]: New session 24 of user ceph-admin.
Dec  6 04:38:33 np0005548916 systemd[1]: Started Session 24 of User ceph-admin.
Dec  6 04:38:34 np0005548916 systemd-logind[788]: New session 25 of user ceph-admin.
Dec  6 04:38:34 np0005548916 systemd[1]: Started Session 25 of User ceph-admin.
Dec  6 04:38:34 np0005548916 systemd-logind[788]: New session 26 of user ceph-admin.
Dec  6 04:38:34 np0005548916 systemd[1]: Started Session 26 of User ceph-admin.
Dec  6 04:38:34 np0005548916 systemd-logind[788]: New session 27 of user ceph-admin.
Dec  6 04:38:34 np0005548916 systemd[1]: Started Session 27 of User ceph-admin.
Dec  6 04:38:35 np0005548916 systemd-logind[788]: New session 28 of user ceph-admin.
Dec  6 04:38:35 np0005548916 systemd[1]: Started Session 28 of User ceph-admin.
Dec  6 04:38:35 np0005548916 systemd-logind[788]: New session 29 of user ceph-admin.
Dec  6 04:38:35 np0005548916 systemd[1]: Started Session 29 of User ceph-admin.
Dec  6 04:38:36 np0005548916 systemd-logind[788]: New session 30 of user ceph-admin.
Dec  6 04:38:36 np0005548916 systemd[1]: Started Session 30 of User ceph-admin.
Dec  6 04:38:37 np0005548916 systemd-logind[788]: New session 31 of user ceph-admin.
Dec  6 04:38:37 np0005548916 systemd[1]: Started Session 31 of User ceph-admin.
Dec  6 04:38:37 np0005548916 systemd-logind[788]: New session 32 of user ceph-admin.
Dec  6 04:38:37 np0005548916 systemd[1]: Started Session 32 of User ceph-admin.
Dec  6 04:38:38 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:38 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:39 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:39 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:39 np0005548916 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73135 (sysctl)
Dec  6 04:38:40 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:40 np0005548916 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  6 04:38:40 np0005548916 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  6 04:38:40 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:41 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:41 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:38:44 np0005548916 systemd[1]: var-lib-containers-storage-overlay-compat363875030-lower\x2dmapped.mount: Deactivated successfully.
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.666852424 +0000 UTC m=+31.158415124 container create 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:12 np0005548916 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck224748378-merged.mount: Deactivated successfully.
Dec  6 04:39:12 np0005548916 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  6 04:39:12 np0005548916 systemd[1]: Started libpod-conmon-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope.
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.645746346 +0000 UTC m=+31.137309076 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:12 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.7829792 +0000 UTC m=+31.274541940 container init 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.791273536 +0000 UTC m=+31.282836236 container start 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.795109018 +0000 UTC m=+31.286671718 container attach 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:12 np0005548916 awesome_shaw[73370]: 167 167
Dec  6 04:39:12 np0005548916 systemd[1]: libpod-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope: Deactivated successfully.
Dec  6 04:39:12 np0005548916 conmon[73370]: conmon 81358b6ab17a0c17589d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope/container/memory.events
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.799311613 +0000 UTC m=+31.290874313 container died 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Dec  6 04:39:12 np0005548916 systemd[1]: var-lib-containers-storage-overlay-3390ceddcaa2e212e5838b73c4611355f5f37077196ff2af454b1a4d06d97f49-merged.mount: Deactivated successfully.
Dec  6 04:39:12 np0005548916 podman[73310]: 2025-12-06 09:39:12.850888853 +0000 UTC m=+31.342451573 container remove 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  6 04:39:12 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:12 np0005548916 systemd[1]: libpod-conmon-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope: Deactivated successfully.
Dec  6 04:39:13 np0005548916 podman[73393]: 2025-12-06 09:39:13.039484058 +0000 UTC m=+0.031054122 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:13 np0005548916 podman[73393]: 2025-12-06 09:39:13.135787751 +0000 UTC m=+0.127357815 container create 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  6 04:39:13 np0005548916 systemd[1]: Started libpod-conmon-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope.
Dec  6 04:39:13 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:13 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:13 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:13 np0005548916 podman[73393]: 2025-12-06 09:39:13.436139202 +0000 UTC m=+0.427709276 container init 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:13 np0005548916 podman[73393]: 2025-12-06 09:39:13.448930913 +0000 UTC m=+0.440500947 container start 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  6 04:39:13 np0005548916 podman[73393]: 2025-12-06 09:39:13.490501067 +0000 UTC m=+0.482071141 container attach 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]: [
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:    {
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "available": false,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "being_replaced": false,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "ceph_device_lvm": false,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "lsm_data": {},
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "lvs": [],
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "path": "/dev/sr0",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "rejected_reasons": [
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "Has a FileSystem",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "Insufficient space (<5GB)"
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        ],
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        "sys_api": {
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "actuators": null,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "device_nodes": [
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:                "sr0"
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            ],
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "devname": "sr0",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "human_readable_size": "482.00 KB",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "id_bus": "ata",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "model": "QEMU DVD-ROM",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "nr_requests": "2",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "parent": "/dev/sr0",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "partitions": {},
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "path": "/dev/sr0",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "removable": "1",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "rev": "2.5+",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "ro": "0",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "rotational": "1",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "sas_address": "",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "sas_device_handle": "",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "scheduler_mode": "mq-deadline",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "sectors": 0,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "sectorsize": "2048",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "size": 493568.0,
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "support_discard": "2048",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "type": "disk",
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:            "vendor": "QEMU"
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:        }
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]:    }
Dec  6 04:39:14 np0005548916 amazing_goldstine[73408]: ]
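
[annotation] The JSON printed by the amazing_goldstine container looks like ceph-volume inventory output; the exact invocation is not logged, so the command below is an assumption. /dev/sr0 is rejected ("Has a FileSystem", "Insufficient space (<5GB)"), which leaves the ceph_vg0/ceph_lv0 volume prepared above as the usable OSD device. A simplified sketch; the real cephadm call also bind-mounts the cluster config and keyrings:

    # assumed invocation behind the JSON above (image digest taken from the log)
    podman run --rm --privileged -v /dev:/dev \
        quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        ceph-volume inventory --format json-pretty
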
Dec  6 04:39:14 np0005548916 systemd[1]: libpod-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope: Deactivated successfully.
Dec  6 04:39:14 np0005548916 podman[73393]: 2025-12-06 09:39:14.347697036 +0000 UTC m=+1.339267070 container died 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:14 np0005548916 systemd[1]: var-lib-containers-storage-overlay-2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04-merged.mount: Deactivated successfully.
Dec  6 04:39:14 np0005548916 podman[73393]: 2025-12-06 09:39:14.405298133 +0000 UTC m=+1.396868167 container remove 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  6 04:39:14 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:14 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:14 np0005548916 systemd[1]: libpod-conmon-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope: Deactivated successfully.
Dec  6 04:39:17 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:17 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.83779864 +0000 UTC m=+0.071745646 container create ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:17 np0005548916 systemd[1]: Started libpod-conmon-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope.
Dec  6 04:39:17 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.809524735 +0000 UTC m=+0.043471801 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.920087089 +0000 UTC m=+0.154034065 container init ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.928003432 +0000 UTC m=+0.161950398 container start ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.932911641 +0000 UTC m=+0.166858667 container attach ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  6 04:39:17 np0005548916 great_jones[75292]: 167 167
Dec  6 04:39:17 np0005548916 systemd[1]: libpod-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope: Deactivated successfully.
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.935407048 +0000 UTC m=+0.169354014 container died ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec  6 04:39:17 np0005548916 podman[75275]: 2025-12-06 09:39:17.977327514 +0000 UTC m=+0.211274490 container remove ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True)
Dec  6 04:39:17 np0005548916 systemd[1]: libpod-conmon-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope: Deactivated successfully.
Dec  6 04:39:18 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:18 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:18 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:18 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:18 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:18 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:18 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:18 np0005548916 systemd[1]: Reached target All Ceph clusters and services.
Dec  6 04:39:18 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:18 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:18 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:18 np0005548916 systemd[1]: Reached target Ceph cluster 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:39:18 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:19 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:19 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:19 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:19 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:19 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:19 np0005548916 systemd[1]: Created slice Slice /system/ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:39:19 np0005548916 systemd[1]: Reached target System Time Set.
Dec  6 04:39:19 np0005548916 systemd[1]: Reached target System Time Synchronized.
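
[annotation] The repeated Reloading cycles above are cephadm installing its unit hierarchy: the umbrella ceph.target ("All Ceph clusters and services"), a per-cluster target keyed by the FSID, and a matching system slice. To inspect what was just created:

    systemctl list-dependencies ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258.target
    systemctl status 'ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258*'   # glob over the cluster's units
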
Dec  6 04:39:19 np0005548916 systemd[1]: Starting Ceph crash.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:39:19 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:19 np0005548916 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:19 np0005548916 podman[75552]: 2025-12-06 09:39:19.857758551 +0000 UTC m=+0.055401112 container create 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  6 04:39:19 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:19 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:19 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:19 np0005548916 podman[75552]: 2025-12-06 09:39:19.930565593 +0000 UTC m=+0.128208174 container init 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:19 np0005548916 podman[75552]: 2025-12-06 09:39:19.837746831 +0000 UTC m=+0.035389422 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:19 np0005548916 podman[75552]: 2025-12-06 09:39:19.941041364 +0000 UTC m=+0.138683925 container start 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  6 04:39:19 np0005548916 bash[75552]: 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0
Dec  6 04:39:19 np0005548916 systemd[1]: Started Ceph crash.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: INFO:ceph-crash:pinging cluster to exercise our key
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.101+0000 7f5bc2f50640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.101+0000 7f5bc2f50640 -1 AuthRegistry(0x7f5bbc0698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.102+0000 7f5bc2f50640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.102+0000 7f5bc2f50640 -1 AuthRegistry(0x7f5bc2f4eff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.105+0000 7f5bc0cc5640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.106+0000 7f5bc2f50640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec  6 04:39:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
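
[annotation] The crash agent's startup ping fails here because no keyring exists at the default client.admin search paths inside the container, so the attempt runs with cephx disabled and the monitor refuses it with EACCES. ceph-crash treats this as non-fatal and settles into scanning /var/lib/ceph/crash every 600 s; its own key is the bind-mounted ceph.client.crash.compute-1.keyring visible in the xfs remount messages above. One way to confirm the key is in place (running the check via podman exec is an assumption):

    podman exec ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1 \
        ls -l /etc/ceph/ceph.client.crash.compute-1.keyring
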
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.679575681 +0000 UTC m=+0.068553596 container create 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  6 04:39:20 np0005548916 systemd[1]: Started libpod-conmon-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope.
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.656986782 +0000 UTC m=+0.045964727 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:20 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.796891878 +0000 UTC m=+0.185869833 container init 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.808750197 +0000 UTC m=+0.197728152 container start 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.813198611 +0000 UTC m=+0.202176536 container attach 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  6 04:39:20 np0005548916 relaxed_borg[75690]: 167 167
Dec  6 04:39:20 np0005548916 systemd[1]: libpod-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope: Deactivated successfully.
Dec  6 04:39:20 np0005548916 conmon[75690]: conmon 67bdb980cdfa59ce4025 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope/container/memory.events
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.818443761 +0000 UTC m=+0.207421686 container died 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default)
Dec  6 04:39:20 np0005548916 systemd[1]: var-lib-containers-storage-overlay-20b1b99cb844c6e5bc7706cb2e0c22d9e4165de86258b013d82eda69c16555fb-merged.mount: Deactivated successfully.
Dec  6 04:39:20 np0005548916 podman[75674]: 2025-12-06 09:39:20.87347854 +0000 UTC m=+0.262456455 container remove 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec  6 04:39:20 np0005548916 systemd[1]: libpod-conmon-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope: Deactivated successfully.
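
The relaxed_borg container above lives for only a few milliseconds: create, start, attach, one line of output ("167 167"), exit, remove. The conmon warning about memory.events is a side effect of that speed; the scope is torn down before conmon can read the cgroup file. The "167 167" output looks like cephadm probing the uid/gid of the ceph user baked into the image (167:167 in these CentOS Stream based Ceph builds). A minimal sketch of such a probe, assuming a stat-based entrypoint (the actual command line is not visible in the journal):

    import subprocess

    IMAGE = "quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec"

    def probe_ceph_uid_gid(image: str = IMAGE) -> tuple[int, int]:
        # Throwaway container that prints "<uid> <gid>" and exits,
        # matching the "167 167" lines in the journal. The stat
        # entrypoint is an assumption; only the observable output
        # is taken from the log.
        out = subprocess.run(
            ["podman", "run", "--rm", "--entrypoint", "stat", image,
             "-c", "%u %g", "/var/lib/ceph"],
            check=True, capture_output=True, text=True,
        ).stdout.split()
        return int(out[0]), int(out[1])
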
Dec  6 04:39:21 np0005548916 podman[75715]: 2025-12-06 09:39:21.06625668 +0000 UTC m=+0.056768970 container create 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:39:21 np0005548916 systemd[1]: Started libpod-conmon-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope.
Dec  6 04:39:21 np0005548916 podman[75715]: 2025-12-06 09:39:21.037759787 +0000 UTC m=+0.028272157 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:21 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:21 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:21 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:21 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:21 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:21 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:21 np0005548916 podman[75715]: 2025-12-06 09:39:21.163186843 +0000 UTC m=+0.153699233 container init 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  6 04:39:21 np0005548916 podman[75715]: 2025-12-06 09:39:21.173565541 +0000 UTC m=+0.164077831 container start 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:21 np0005548916 podman[75715]: 2025-12-06 09:39:21.178343256 +0000 UTC m=+0.168855606 container attach 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:21 np0005548916 festive_clarke[75731]: --> passed data devices: 0 physical, 1 LVM
Dec  6 04:39:21 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:21 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:21 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new a01bc6a6-e368-4763-a10f-41794e4ef717
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec  6 04:39:22 np0005548916 lvm[75794]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:39:22 np0005548916 lvm[75794]: VG ceph_vg0 finished
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: stderr: got monmap epoch 1
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: --> Creating keyring file for osd.0
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec  6 04:39:22 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid a01bc6a6-e368-4763-a10f-41794e4ef717 --setuser ceph --setgroup ceph
Dec  6 04:39:25 np0005548916 festive_clarke[75731]: stderr: 2025-12-06T09:39:22.973+0000 7f63b388f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Dec  6 04:39:25 np0005548916 festive_clarke[75731]: stderr: 2025-12-06T09:39:23.242+0000 7f63b388f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec  6 04:39:25 np0005548916 festive_clarke[75731]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  6 04:39:26 np0005548916 festive_clarke[75731]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
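
festive_clarke is the actual `ceph-volume lvm create` run for ceph_vg0/ceph_lv0: prepare generates keys, registers the OSD with `osd new`, mounts a tmpfs at /var/lib/ceph/osd/ceph-0, links the LV in as `block`, fetches the monmap and runs `ceph-osd --mkfs`; activate then primes the OSD dir from the BlueStore label and fixes ownership. The two stderr lines ("No valid bdev label found", "_read_fsid unparsable uuid") are expected on a brand-new LV, since mkfs inspects the device before the first label has been written. A condensed sketch of the sequence, copied from the "Running command:" lines above (keyring handling and the `osd new` mon round-trip omitted):

    import subprocess

    def run(*cmd: str) -> None:
        # Mirror of the "Running command:" lines in the journal above.
        print("Running command:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def lvm_prepare_activate(osd_id: int, osd_fsid: str,
                             lv: str = "/dev/ceph_vg0/ceph_lv0") -> None:
        # Sketch only: paths and flags are taken from the log; error
        # handling and cephx key generation are left out.
        data = f"/var/lib/ceph/osd/ceph-{osd_id}"
        run("/usr/bin/mount", "-t", "tmpfs", "tmpfs", data)           # prepare
        run("/usr/bin/ln", "-s", lv, f"{data}/block")
        run("/usr/bin/ceph-osd", "--cluster", "ceph",
            "--osd-objectstore", "bluestore", "--mkfs", "-i", str(osd_id),
            "--osd-data", data, "--osd-uuid", osd_fsid,
            "--setuser", "ceph", "--setgroup", "ceph")
        run("/usr/bin/ceph-bluestore-tool", "--cluster=ceph",         # activate
            "prime-osd-dir", "--dev", lv, "--path", data, "--no-mon-config")
        run("/usr/bin/ln", "-snf", lv, f"{data}/block")
        run("/usr/bin/chown", "-R", "ceph:ceph", data)
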
Dec  6 04:39:26 np0005548916 systemd[1]: libpod-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Deactivated successfully.
Dec  6 04:39:26 np0005548916 systemd[1]: libpod-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Consumed 2.267s CPU time.
Dec  6 04:39:26 np0005548916 podman[76696]: 2025-12-06 09:39:26.483626628 +0000 UTC m=+0.049970705 container died 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:26 np0005548916 systemd[1]: var-lib-containers-storage-overlay-837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21-merged.mount: Deactivated successfully.
Dec  6 04:39:26 np0005548916 podman[76696]: 2025-12-06 09:39:26.527655266 +0000 UTC m=+0.093999313 container remove 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec  6 04:39:26 np0005548916 systemd[1]: libpod-conmon-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Deactivated successfully.
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.179532324 +0000 UTC m=+0.052658968 container create 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  6 04:39:27 np0005548916 systemd[1]: Started libpod-conmon-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope.
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.159218663 +0000 UTC m=+0.032345297 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:27 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.293847397 +0000 UTC m=+0.166974041 container init 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.303874643 +0000 UTC m=+0.177001287 container start 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.308511063 +0000 UTC m=+0.181637707 container attach 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:39:27 np0005548916 fervent_shockley[76816]: 167 167
Dec  6 04:39:27 np0005548916 systemd[1]: libpod-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope: Deactivated successfully.
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.312430738 +0000 UTC m=+0.185557382 container died 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  6 04:39:27 np0005548916 systemd[1]: var-lib-containers-storage-overlay-37eb6bb8d3725a7677d32a1e7a38351260f1bc45e905ced346ddf387b6b8d75a-merged.mount: Deactivated successfully.
Dec  6 04:39:27 np0005548916 podman[76800]: 2025-12-06 09:39:27.36465187 +0000 UTC m=+0.237778474 container remove 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:27 np0005548916 systemd[1]: libpod-conmon-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope: Deactivated successfully.
Dec  6 04:39:27 np0005548916 podman[76839]: 2025-12-06 09:39:27.568817973 +0000 UTC m=+0.060288971 container create 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec  6 04:39:27 np0005548916 systemd[1]: Started libpod-conmon-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope.
Dec  6 04:39:27 np0005548916 podman[76839]: 2025-12-06 09:39:27.540086321 +0000 UTC m=+0.031557349 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:27 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:27 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:27 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:27 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:27 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:27 np0005548916 podman[76839]: 2025-12-06 09:39:27.677613926 +0000 UTC m=+0.169084904 container init 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:27 np0005548916 podman[76839]: 2025-12-06 09:39:27.688674247 +0000 UTC m=+0.180145205 container start 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid)
Dec  6 04:39:27 np0005548916 podman[76839]: 2025-12-06 09:39:27.692564831 +0000 UTC m=+0.184035789 container attach 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  6 04:39:27 np0005548916 objective_noyce[76856]: {
Dec  6 04:39:27 np0005548916 objective_noyce[76856]:    "0": [
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:        {
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "devices": [
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "/dev/loop3"
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            ],
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "lv_name": "ceph_lv0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "lv_size": "21470642176",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5ecd3f74-dade-5fc4-92ce-8950ae424258,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=a01bc6a6-e368-4763-a10f-41794e4ef717,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "lv_uuid": "vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "name": "ceph_lv0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "tags": {
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.block_uuid": "vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.cephx_lockbox_secret": "",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.cluster_fsid": "5ecd3f74-dade-5fc4-92ce-8950ae424258",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.cluster_name": "ceph",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.crush_device_class": "",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.encrypted": "0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.osd_fsid": "a01bc6a6-e368-4763-a10f-41794e4ef717",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.osd_id": "0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.type": "block",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.vdo": "0",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:                "ceph.with_tpm": "0"
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            },
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "type": "block",
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:            "vg_name": "ceph_vg0"
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:        }
Dec  6 04:39:28 np0005548916 objective_noyce[76856]:    ]
Dec  6 04:39:28 np0005548916 objective_noyce[76856]: }
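
objective_noyce prints an inventory keyed by OSD id, matching the JSON shape emitted by `ceph-volume lvm list --format json` (the exact invocation is not shown in the journal). The ceph.* LV tags carry everything needed to match this LV to the cluster fsid and osd.0. A small parser for that shape, as a sketch:

    import json

    def osds_from_lvm_list(raw: str) -> list[dict]:
        # Flatten {osd_id: [lv, ...]} into per-LV records using the
        # ceph.* tags shown in the output above.
        records = []
        for osd_id, lvs in json.loads(raw).items():
            for lv in lvs:
                tags = lv.get("tags", {})
                records.append({
                    "osd_id": int(osd_id),
                    "osd_fsid": tags.get("ceph.osd_fsid"),
                    "cluster_fsid": tags.get("ceph.cluster_fsid"),
                    "block": lv.get("lv_path"),
                })
        return records
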
Dec  6 04:39:28 np0005548916 systemd[1]: libpod-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope: Deactivated successfully.
Dec  6 04:39:28 np0005548916 podman[76839]: 2025-12-06 09:39:28.044900046 +0000 UTC m=+0.536371054 container died 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:28 np0005548916 systemd[1]: var-lib-containers-storage-overlay-d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb-merged.mount: Deactivated successfully.
Dec  6 04:39:28 np0005548916 podman[76839]: 2025-12-06 09:39:28.104867264 +0000 UTC m=+0.596338262 container remove 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  6 04:39:28 np0005548916 systemd[1]: libpod-conmon-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope: Deactivated successfully.
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.834169411 +0000 UTC m=+0.052312125 container create dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:28 np0005548916 systemd[1]: Started libpod-conmon-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope.
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.810253436 +0000 UTC m=+0.028396170 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:28 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.934890736 +0000 UTC m=+0.153033490 container init dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.945378098 +0000 UTC m=+0.163520832 container start dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.949360015 +0000 UTC m=+0.167502719 container attach dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:28 np0005548916 eager_jepsen[76988]: 167 167
Dec  6 04:39:28 np0005548916 systemd[1]: libpod-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope: Deactivated successfully.
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.953294601 +0000 UTC m=+0.171437305 container died dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:28 np0005548916 systemd[1]: var-lib-containers-storage-overlay-77326abbc5f0f3173a79671c7b1c210057b0a35127159252d53e06946da7f835-merged.mount: Deactivated successfully.
Dec  6 04:39:28 np0005548916 podman[76971]: 2025-12-06 09:39:28.998078186 +0000 UTC m=+0.216220900 container remove dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  6 04:39:29 np0005548916 systemd[1]: libpod-conmon-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope: Deactivated successfully.
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.255263048 +0000 UTC m=+0.054465700 container create 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:39:29 np0005548916 systemd[1]: Started libpod-conmon-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope.
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.22691076 +0000 UTC m=+0.026113512 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:29 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:29 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:29 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:29 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:29 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:29 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.385606694 +0000 UTC m=+0.184809366 container init 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.397307498 +0000 UTC m=+0.196510160 container start 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.401500852 +0000 UTC m=+0.200703554 container attach 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec  6 04:39:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]:                            [--no-systemd] [--no-tmpfs]
Dec  6 04:39:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]: ceph-volume activate: error: unrecognized arguments: --bad-option
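
The "-osd-0-activate-test" container is a capability probe, not a failure: the caller feeds `ceph-volume activate` a deliberately bogus flag and classifies the argparse error. "unrecognized arguments: --bad-option" means the top-level `activate` subcommand exists and parsed its known options; an older ceph-volume would instead reject `activate` itself with an "invalid choice" error before ever reaching the flag. A sketch of that probe, with the podman invocation details assumed:

    import subprocess

    IMAGE = "quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec"

    def supports_activate(image: str = IMAGE) -> bool:
        # Run `ceph-volume activate --bad-option` in a throwaway
        # container (entrypoint assumed) and inspect stderr.
        proc = subprocess.run(
            ["podman", "run", "--rm", "--entrypoint", "ceph-volume",
             image, "activate", "--bad-option"],
            capture_output=True, text=True,
        )
        # "unrecognized arguments": the subcommand exists and parsed.
        # "invalid choice: 'activate'": ceph-volume predates it.
        return "unrecognized arguments" in proc.stderr
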
Dec  6 04:39:29 np0005548916 systemd[1]: libpod-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope: Deactivated successfully.
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.574223241 +0000 UTC m=+0.373425923 container died 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:39:29 np0005548916 systemd[1]: var-lib-containers-storage-overlay-5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9-merged.mount: Deactivated successfully.
Dec  6 04:39:29 np0005548916 podman[77017]: 2025-12-06 09:39:29.62609644 +0000 UTC m=+0.425299092 container remove 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:29 np0005548916 systemd[1]: libpod-conmon-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope: Deactivated successfully.
Dec  6 04:39:29 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:30 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:30 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:30 np0005548916 systemd[1]: Reloading.
Dec  6 04:39:30 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:39:30 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:39:30 np0005548916 systemd[1]: Starting Ceph osd.0 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:39:30 np0005548916 podman[77194]: 2025-12-06 09:39:30.706403867 +0000 UTC m=+0.038481479 container create e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  6 04:39:30 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:30 np0005548916 podman[77194]: 2025-12-06 09:39:30.690293691 +0000 UTC m=+0.022371323 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:30 np0005548916 podman[77194]: 2025-12-06 09:39:30.792224907 +0000 UTC m=+0.124302559 container init e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  6 04:39:30 np0005548916 podman[77194]: 2025-12-06 09:39:30.803515747 +0000 UTC m=+0.135593379 container start e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:30 np0005548916 podman[77194]: 2025-12-06 09:39:30.852401143 +0000 UTC m=+0.184478855 container attach e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:39:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:30 np0005548916 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:30 np0005548916 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:31 np0005548916 lvm[77290]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:39:31 np0005548916 lvm[77290]: VG ceph_vg0 finished
Dec  6 04:39:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  6 04:39:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:31 np0005548916 bash[77194]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  6 04:39:31 np0005548916 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:31 np0005548916 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:39:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:31 np0005548916 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  6 04:39:31 np0005548916 bash[77194]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  6 04:39:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:32 np0005548916 bash[77194]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:32 np0005548916 bash[77194]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:39:32 np0005548916 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:39:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:32 np0005548916 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  6 04:39:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  6 04:39:32 np0005548916 bash[77194]: --> ceph-volume lvm activate successful for osd ID: 0
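
The real activation unit logs every line twice (once under the container name, once under bash[77194]) because the journal captures the same stdout through both the unit's shell and the container. Functionally, ceph-volume first attempts raw-mode activation, reports "Failed to activate via raw: did not find any matching OSD to activate", and then succeeds through the LVM backend. A sketch of that fallback order, with try_raw/try_lvm as hypothetical stand-ins for the two backends:

    # Hypothetical stand-ins for the two ceph-volume activation backends.
    def try_raw(osd_id: int, osd_fsid: str) -> None:
        raise LookupError("did not find any matching OSD to activate")

    def try_lvm(osd_id: int, osd_fsid: str) -> None:
        pass  # prime-osd-dir + symlink + chown, as in the lines above

    def activate_osd(osd_id: int, osd_fsid: str) -> str:
        # Fallback order observed in the log: raw first, then lvm.
        for name, backend in (("raw", try_raw), ("lvm", try_lvm)):
            try:
                backend(osd_id, osd_fsid)
                print(f"--> ceph-volume {name} activate successful "
                      f"for osd ID: {osd_id}")
                return name
            except LookupError as exc:
                print(f"--> Failed to activate via {name}: {exc}")
        raise RuntimeError(f"osd.{osd_id} could not be activated")
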
Dec  6 04:39:32 np0005548916 systemd[1]: libpod-e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019.scope: Deactivated successfully.
Dec  6 04:39:32 np0005548916 systemd[1]: libpod-e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019.scope: Consumed 1.597s CPU time.
Dec  6 04:39:32 np0005548916 podman[77384]: 2025-12-06 09:39:32.273619698 +0000 UTC m=+0.030391309 container died e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1)
Dec  6 04:39:32 np0005548916 systemd[1]: var-lib-containers-storage-overlay-d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5-merged.mount: Deactivated successfully.
Dec  6 04:39:32 np0005548916 podman[77384]: 2025-12-06 09:39:32.357317396 +0000 UTC m=+0.114089037 container remove e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:39:32 np0005548916 podman[77446]: 2025-12-06 09:39:32.628229591 +0000 UTC m=+0.059621688 container create 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:32 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:32 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:32 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:32 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:32 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
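[annotation] The kernel's "supports timestamps until 2038" lines are informational, not errors: the XFS filesystem backing these overlay bind mounts was formatted without the bigtime feature, so inode timestamps cap at 0x7fffffff (2038-01-19). A hedged confirmation on the host, with the mount point assumed from the overlay paths above:

    # bigtime=0 means 2038-limited timestamps; bigtime=1 means the limit is lifted
    xfs_info /var/lib/containers | grep -o 'bigtime=[01]'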
Dec  6 04:39:32 np0005548916 podman[77446]: 2025-12-06 09:39:32.596921761 +0000 UTC m=+0.028313848 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:32 np0005548916 podman[77446]: 2025-12-06 09:39:32.71631965 +0000 UTC m=+0.147711737 container init 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  6 04:39:32 np0005548916 podman[77446]: 2025-12-06 09:39:32.723672224 +0000 UTC m=+0.155064321 container start 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  6 04:39:32 np0005548916 bash[77446]: 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860
Dec  6 04:39:32 np0005548916 systemd[1]: Started Ceph osd.0 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
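[annotation] systemd has started the long-running OSD unit here. A hedged status check, assuming cephadm's usual ceph-<fsid>@<daemon> unit naming consistent with the container names in this log:

    systemctl status 'ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@osd.0.service'
    podman ps --filter name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0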
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: pidfile_write: ignore empty --pid-file
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:32 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
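[annotation] Two recurring bdev messages in this block are expected noise rather than failures: the F_SET_FILE_RW_HINT fcntl returning EINVAL typically just means this device-mapper target does not accept per-file write hints, and the st_blksize line records that BlueStore keeps its own 4 KiB bdev_block_size even though the LV reports 512-byte logical sectors. BlueStore opens and closes the device several times while probing the label and sizes, which is why the same four lines repeat below. A hedged look at the underlying sector sizes (device paths taken from the log above):

    blockdev --getss --getpbsz /dev/ceph_vg0/ceph_lv0   # logical and physical sector size
    cat /sys/block/dm-0/queue/logical_block_size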
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.496517764 +0000 UTC m=+0.117992672 container create 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.405337438 +0000 UTC m=+0.026812386 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:33 np0005548916 systemd[1]: Started libpod-conmon-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope.
Dec  6 04:39:33 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.599813437 +0000 UTC m=+0.221288425 container init 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.612581067 +0000 UTC m=+0.234056015 container start 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.616402779 +0000 UTC m=+0.237877707 container attach 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  6 04:39:33 np0005548916 clever_cannon[77594]: 167 167
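[annotation] The short-lived clever_cannon container exits right after printing "167 167", which looks like cephadm's uid/gid probe of the image: it stats a ceph-owned path inside the container to learn which numeric ids to chown host files to. A hedged reproduction, assuming that is indeed the probe being run:

    podman run --rm quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        stat -c '%u %g' /var/lib/ceph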
Dec  6 04:39:33 np0005548916 systemd[1]: libpod-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope: Deactivated successfully.
Dec  6 04:39:33 np0005548916 conmon[77594]: conmon 3a9a52a6044f2e3916cd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope/container/memory.events
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.623622948 +0000 UTC m=+0.245097876 container died 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Dec  6 04:39:33 np0005548916 systemd[1]: var-lib-containers-storage-overlay-c0d084adbb910e2d84183d32f9845872f41323414d21fe37ecb727234b4c5dc6-merged.mount: Deactivated successfully.
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:33 np0005548916 podman[77575]: 2025-12-06 09:39:33.796233523 +0000 UTC m=+0.417708441 container remove 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec  6 04:39:33 np0005548916 systemd[1]: libpod-conmon-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope: Deactivated successfully.
Dec  6 04:39:33 np0005548916 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:34 np0005548916 podman[77622]: 2025-12-06 09:39:33.996526262 +0000 UTC m=+0.029263170 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: load: jerasure load: lrc 
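[annotation] "load: jerasure load: lrc" records the erasure-code plugins the OSD preloaded at startup. A hedged look at what else ships in the image, with the plugin directory assumed for an el9-based Ceph build:

    ls /usr/lib64/ceph/erasure-code/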
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
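[annotation] The _set_cache_sizes line shows the BlueStore cache budget: 1 GiB total, split 45% metadata, 45% RocksDB kv, 4% kv_onode, and 6% data. 1 GiB matches the usual HDD default, consistent with the "rotational device" lines above. A hedged confirmation:

    ceph config get osd.0 bluestore_cache_size_hdd    # expect 1073741824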
Dec  6 04:39:34 np0005548916 podman[77622]: 2025-12-06 09:39:34.285239743 +0000 UTC m=+0.317976631 container create 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  6 04:39:34 np0005548916 systemd[1]: Started libpod-conmon-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope.
Dec  6 04:39:34 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:34 np0005548916 podman[77622]: 2025-12-06 09:39:34.599388059 +0000 UTC m=+0.632125037 container init 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Dec  6 04:39:34 np0005548916 podman[77622]: 2025-12-06 09:39:34.615748563 +0000 UTC m=+0.648485481 container start 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:39:34 np0005548916 podman[77622]: 2025-12-06 09:39:34.621482761 +0000 UTC m=+0.654219659 container attach 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
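[annotation] The OSD shards are using the mclock_scheduler op queue, with the per-IO cost and per-shard bandwidth logged just above derived from the configured device capacity. Hedged queries for the knobs involved:

    ceph config get osd.0 osd_op_queue
    ceph config get osd.0 osd_mclock_max_capacity_iops_hdd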
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:34 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:35 np0005548916 lvm[77732]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:39:35 np0005548916 lvm[77732]: VG ceph_vg0 finished
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:35 np0005548916 lvm[77737]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:39:35 np0005548916 lvm[77737]: VG ceph_vg0 finished
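[annotation] The duplicated lvm "PV online / VG finished" pairs are udev-triggered pvscan events as device nodes reappear; they confirm that ceph_vg0, backed by /dev/loop3, is complete. A hedged cross-check:

    pvs /dev/loop3
    lvs ceph_vg0/ceph_lv0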
Dec  6 04:39:35 np0005548916 festive_buck[77643]: {}
Dec  6 04:39:35 np0005548916 systemd[1]: libpod-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Deactivated successfully.
Dec  6 04:39:35 np0005548916 systemd[1]: libpod-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Consumed 1.278s CPU time.
Dec  6 04:39:35 np0005548916 podman[77622]: 2025-12-06 09:39:35.421787138 +0000 UTC m=+1.454524016 container died 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec  6 04:39:35 np0005548916 systemd[1]: var-lib-containers-storage-overlay-88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a-merged.mount: Deactivated successfully.
Dec  6 04:39:35 np0005548916 podman[77622]: 2025-12-06 09:39:35.480253404 +0000 UTC m=+1.512990282 container remove 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:39:35 np0005548916 systemd[1]: libpod-conmon-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Deactivated successfully.
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount shared_bdev_used = 0
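[annotation] BlueFS has mounted the shared block device (bdev 1) with no locked allocations and zero space used yet, as expected for a freshly created OSD. Once the daemon is up, a hedged way to watch BlueFS usage over the admin socket (run inside the OSD container):

    ceph daemon osd.0 perf dump bluefs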
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Git sha 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DB SUMMARY
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DB Session ID:  FMGV1GCT5BLWAHJBE977
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
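[annotation] The DB SUMMARY above describes a minimal RocksDB instance: one SST file (000030.sst) in db, nothing in db.slow, and a ~5 KB WAL in db.wal. If these files ever need offline inspection, a hedged sketch with the OSD stopped (ceph-bluestore-tool copies the BlueFS tree out to a plain directory):

    ceph-bluestore-tool bluefs-export --path /var/lib/ceph/osd/ceph-0 --out-dir /tmp/bluefs.dump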
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                     Options.env: 0x55fb235a5dc0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                Options.info_log: 0x55fb235a97a0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                 Options.wal_dir: db.wal
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.write_buffer_manager: 0x55fb2369ea00
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.row_cache: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.wal_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.wal_compression: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_background_jobs: 4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.max_background_flushes: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Compression algorithms supported:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kZSTD supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kXpressCompression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kBZip2Compression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kZSTDNotFinalCompression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kLZ4Compression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kZlibCompression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kLZ4HCCompression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     kSnappyCompression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DMutex implementation: pthread_mutex_t
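[annotation] Note that LZ4, Zlib, and Snappy are compiled into this build while ZSTD is not. The long per-column-family Options dumps below mirror what Ceph feeds RocksDB through bluestore_rocksdb_options plus its own overrides. A hedged way to see the option string on the Ceph side:

    ceph config get osd.0 bluestore_rocksdb_options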
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227ce9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227ce9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227ce9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file: db/MANIFEST-000032 succeeded, manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5, prev_log_number is 0, max_column_family is 11, min_log_number_to_keep is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: af557f43-9483-4a65-96a9-1d3a8a4b0b2d
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975639798, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975640237, "job": 1, "event": "recovery_finished"}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: freelist init
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: freelist _read_cfg
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs umount
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) close
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluefs mount shared_bdev_used = 4718592
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Git sha 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DB SUMMARY
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DB Session ID:  FMGV1GCT5BLWAHJBE976
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                     Options.env: 0x55fb23742310
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                Options.info_log: 0x55fb2287c7c0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                                 Options.wal_dir: db.wal
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.write_buffer_manager: 0x55fb2369ea00
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.row_cache: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                              Options.wal_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.wal_compression: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_background_jobs: 4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.max_background_flushes: -1
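
Most of the DB-wide lines above map one-to-one onto rocksdb::DBOptions fields, and a couple deserve decoding: wal_recovery_mode: 2 is WALRecoveryMode::kPointInTimeRecovery, and delayed_write_rate is the throttle applied once the write path starts stalling. A sketch reproducing the highlights with the stock RocksDB API, values copied from the dump (illustrative only, not the code Ceph actually runs):

    #include <rocksdb/options.h>

    rocksdb::DBOptions MakeDbOptions() {
        rocksdb::DBOptions db;
        db.create_if_missing   = false;      // Options.create_if_missing: 0
        db.paranoid_checks     = true;       // Options.paranoid_checks: 1
        db.wal_dir             = "db.wal";   // WAL segregated onto its own path
        db.max_open_files      = -1;         // keep every table file open
        db.max_background_jobs = 4;
        db.max_total_wal_size  = 1ULL << 30;         // 1073741824
        db.delayed_write_rate  = 16ULL << 20;        // 16777216 B/s when throttling
        db.compaction_readahead_size = 2ULL << 20;   // 2097152, for rotational media
        db.wal_recovery_mode =
            rocksdb::WALRecoveryMode::kPointInTimeRecovery;  // mode "2" in the log
        return db;
    }
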
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Compression algorithms supported:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kZSTD supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kXpressCompression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kBZip2Compression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kLZ4Compression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kZlibCompression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kLZ4HCCompression supported: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         kSnappyCompression supported: 1
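
Only LZ4, LZ4HC, Zlib and Snappy were compiled into this binary; ZSTD is absent. (The compression_opts.level of 32767 seen in the column-family dumps below is RocksDB's "use the codec's default level" sentinel, so it is harmless regardless of codec.) To query a build's codecs programmatically, recent RocksDB releases expose GetSupportedCompressions(); a sketch, assuming that API is available in the build at hand:

    #include <iostream>
    #include <rocksdb/convenience.h>

    int main() {
        // Returns the CompressionType values this binary was built with.
        for (rocksdb::CompressionType t : rocksdb::GetSupportedCompressions()) {
            std::cout << "supported compression type: "
                      << static_cast<int>(t) << "\n";  // e.g. 0x4 = kLZ4Compression
        }
        return 0;
    }
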
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
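
This per-column-family block is where the Ceph tuning shows: 16 MiB memtables, up to 64 of them with a merge threshold of 6 (so roughly 96 MiB can accumulate before a forced flush), level-0 compaction at 8 files, 64 MiB target SSTs, and a 1 GiB L1 growing 8x per level (L2 = 8 GiB, L3 = 64 GiB, and so on, since level_compaction_dynamic_level_bytes is off). The merge_operator on [default] is Ceph's own prefix-dispatched operator and BinnedLRUCache is Ceph's cache implementation, so a faithful reproduction is not possible with stock RocksDB; the sketch below approximates the same shape with stock pieces (NewLRUCache stands in for BinnedLRUCache, and the bloom filter's bits-per-key is not in the log, so 10 is an assumption):

    #include <rocksdb/cache.h>
    #include <rocksdb/filter_policy.h>
    #include <rocksdb/options.h>
    #include <rocksdb/table.h>

    rocksdb::ColumnFamilyOptions MakeCfOptions() {
        rocksdb::BlockBasedTableOptions t;
        t.block_size = 4096;
        t.cache_index_and_filter_blocks = true;
        t.pin_top_level_index_and_filter = true;
        t.format_version = 5;
        t.filter_policy.reset(rocksdb::NewBloomFilterPolicy(10));  // bits/key assumed
        t.block_cache = rocksdb::NewLRUCache(483183820, /*num_shard_bits=*/4);

        rocksdb::ColumnFamilyOptions cf;
        cf.table_factory.reset(rocksdb::NewBlockBasedTableFactory(t));
        cf.write_buffer_size = 16 << 20;           // 16 MiB memtables
        cf.max_write_buffer_number = 64;
        cf.min_write_buffer_number_to_merge = 6;   // flush ~6 x 16 MiB together
        cf.compression = rocksdb::kLZ4Compression;
        cf.level0_file_num_compaction_trigger = 8;
        cf.target_file_size_base = 64 << 20;       // 67108864
        cf.max_bytes_for_level_base = 1ULL << 30;  // 1073741824 (L1)
        cf.max_bytes_for_level_multiplier = 8.0;   // L2 = 8 GiB, L3 = 64 GiB, ...
        return cf;
    }
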
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
[... remaining options for column family [m-0] are identical to those logged for [default] above ...]
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
[... options for column family [m-1] are identical to those logged for [m-0] above ...]
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
[... options for column family [m-2] are identical to those logged for [m-0] above ...]
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
[... options for column family [p-0] are identical to those logged for [m-0] above ...]
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
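
Each column-family section above and below repeats the same "Options.<name>: <value>" layout, which makes the dump easy to fold into per-family dictionaries for diffing. A minimal parsing sketch under that assumption (the helper name parse_option_dumps and the file name osd.log are illustrative, not part of this log):

    import re
    from collections import defaultdict

    # Key/value lines look like:
    #   ... ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
    OPT_RE = re.compile(r'rocksdb:\s+Options\.([\w.\[\]]+): (.+)$')
    # Section headers look like:
    #   ... rocksdb: [db/column_family.cc:630] ----- Options for column family [p-0]:
    CF_RE = re.compile(r'Options for column family \[([^\]]+)\]')

    def parse_option_dumps(lines):
        """Group Options.* values by the most recent column-family header."""
        families = defaultdict(dict)
        current = 'default'
        for line in lines:
            header = CF_RE.search(line)
            if header:
                current = header.group(1)
                continue
            opt = OPT_RE.search(line)
            if opt:
                families[current][opt.group(1)] = opt.group(2).strip()
        return families

    # Usage sketch:
    # with open('osd.log') as f:
    #     families = parse_option_dumps(f)
    # print(families['p-0']['write_buffer_size'])   # -> '16777216'

In this dump the Options.* values are identical across all six families; the only visible differences sit in the table_factory blocks (block cache pointer and capacity), which this sketch deliberately skips.
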
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
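
With Options.level_compaction_dynamic_level_bytes: 0 and every max_bytes_for_level_multiplier_addtl[] factor at 1, RocksDB's classic leveled sizing applies: L1 is capped at max_bytes_for_level_base and each deeper level is multiplier times larger. A quick arithmetic sketch of the resulting targets for the values dumped above (this is the standard RocksDB formula, not something printed in the log itself):

    # Values from the dump above.
    max_bytes_for_level_base = 1_073_741_824    # 1 GiB
    multiplier = 8.0                            # max_bytes_for_level_multiplier
    num_levels = 7

    target = max_bytes_for_level_base
    for level in range(1, num_levels):
        print(f"L{level} target: {target / 2**30:,.0f} GiB")
        target *= multiplier
    # L1 1 GiB, L2 8 GiB, L3 64 GiB, L4 512 GiB, L5 4,096 GiB, L6 32,768 GiB;
    # the deepest targets far exceed any single OSD, so in practice only the
    # first few levels ever fill.
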
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227cf350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227ce9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
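
The table_factory blocks show two shared BinnedLRUCache instances: 0x55fb227cf350 with capacity 483183820 (~461 MiB) backing the p-* families, and 0x55fb227ce9b0 with capacity 536870912 (512 MiB) backing the O-* families, both with num_shard_bits: 4. Assuming the usual RocksDB/Ceph sharding rule of 2**num_shard_bits shards (the shard count itself is not logged), the per-shard capacities work out as:

    # Cache capacities as dumped; the shard count follows the standard
    # 2**num_shard_bits rule, which is assumed here rather than logged.
    caches = {
        '0x55fb227cf350 (p-* families)': 483_183_820,
        '0x55fb227ce9b0 (O-* families)': 536_870_912,
    }
    num_shard_bits = 4
    shards = 2 ** num_shard_bits            # 16 shards
    for name, capacity in caches.items():
        print(f"{name}: {capacity / 2**20:.0f} MiB total, "
              f"{capacity / shards / 2**20:.1f} MiB per shard")
    # -> ~461 MiB total / 28.8 MiB per shard, and 512 MiB / 32.0 MiB per shard
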
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fb227ce9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
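
The write-buffer settings are likewise identical in every family: 16 MiB memtables (write_buffer_size), up to 64 held in memory (max_write_buffer_number), and six merged per flush (min_write_buffer_number_to_merge). A small sketch of the implied bounds; the product is a theoretical per-family ceiling, not typical usage:

    # Values from the dumps above.
    write_buffer_size = 16_777_216                 # 16 MiB per memtable
    max_write_buffer_number = 64
    min_write_buffer_number_to_merge = 6

    ceiling = write_buffer_size * max_write_buffer_number
    flush_batch = write_buffer_size * min_write_buffer_number_to_merge
    print(f"per-CF memtable ceiling: {ceiling / 2**20:.0f} MiB")     # 1024 MiB
    print(f"typical flush batch:    {flush_batch / 2**20:.0f} MiB")  # 96 MiB
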
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fb227ce9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
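The leveled-compaction settings in the dump above imply fixed per-level capacity targets, since level_compaction_dynamic_level_bytes is 0: with max_bytes_for_level_base = 1073741824 (1 GiB), max_bytes_for_level_multiplier = 8, and every multiplier_addtl factor at 1, L1 targets 1 GiB, L2 8 GiB, and so on up to L6, while L0 is driven by level0_file_num_compaction_trigger = 8 rather than bytes. A minimal Python sketch of that arithmetic, illustrative only:

    # Per-level capacity targets implied by the options dumped above
    # (static sizing: level_compaction_dynamic_level_bytes = 0).
    # L0 is excluded; it is governed by file-count triggers, not bytes.
    base = 1073741824                  # Options.max_bytes_for_level_base
    mult = 8.0                         # Options.max_bytes_for_level_multiplier
    for level in range(1, 7):          # L1..L6 for num_levels = 7
        target = base * mult ** (level - 1)
        print(f"L{level}: {target / 2**30:.0f} GiB")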
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: af557f43-9483-4a65-96a9-1d3a8a4b0b2d
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975892825, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975906245, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975909947, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975913041, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975915119, "job": 1, "event": "recovery_finished"}
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fb23782000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: DB pointer 0x55fb23750000
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
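The option string logged by _open_db above is the comma-separated form that Ceph typically carries in its bluestore_rocksdb_options setting. A minimal sketch that splits such a string into key/value pairs for review (the string is copied from the line above):

    # Split the comma-separated rocksdb option string that bluestore
    # logs at _open_db into a dict for easier review.
    opts = ("compression=kLZ4Compression,max_write_buffer_number=64,"
            "min_write_buffer_number_to_merge=6,"
            "compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,"
            "max_background_jobs=4,level0_file_num_compaction_trigger=8,"
            "max_bytes_for_level_base=1073741824,"
            "max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,"
            "max_total_wal_size=1073741824,writable_file_max_buffer_size=0")
    parsed = dict(kv.split("=", 1) for kv in opts.split(","))
    print(parsed["write_buffer_size"])     # -> 16777216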
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usag
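The #012 and #011 sequences in the stats dump above (and in the table_factory dump earlier) are rsyslog's octal escapes for embedded newlines and tabs; multi-line RocksDB tables arrive flattened onto single syslog records. A minimal sketch to undo the escaping when reading such records (hypothetical input path):

    # rsyslog escapes control characters as #<octal>: newline -> #012,
    # tab -> #011. Undo that to read multi-line records such as the
    # "DUMPING STATS" tables above.
    def unescape(record: str) -> str:
        return record.replace("#012", "\n").replace("#011", "\t")

    with open("/var/log/messages") as f:   # hypothetical input path
        for line in f:
            if "#012" in line or "#011" in line:
                print(unescape(line), end="")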
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: _get_class not permitted to load lua
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: _get_class not permitted to load sdk
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
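The crush map feature values above are bitmasks of Ceph feature flags, and the "was 8705" comparison is easier to read in hex, where individual bits line up. A minimal sketch; the per-bit meanings are defined in Ceph's feature tables and are deliberately not decoded here:

    # Print the feature masks logged above in hex for bit-level comparison.
    for value in (8705, 288232575208783872):
        print(f"{value:#018x}")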
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 load_pgs
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 load_pgs opened 0 pgs
Dec  6 04:39:35 np0005548916 ceph-osd[77465]: osd.0 0 log_to_monitors true
Dec  6 04:39:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:35.971+0000 7fe795721740 -1 osd.0 0 log_to_monitors true
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 done with init, starting boot process
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 start_boot
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  6 04:39:37 np0005548916 ceph-osd[77465]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec  6 04:39:39 np0005548916 podman[78299]: 2025-12-06 09:39:39.031501408 +0000 UTC m=+0.157191203 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 04:39:39 np0005548916 podman[78299]: 2025-12-06 09:39:39.142619231 +0000 UTC m=+0.268309006 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:39.996092002 +0000 UTC m=+0.042356762 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.101692155 +0000 UTC m=+0.147956905 container create aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:39:40 np0005548916 systemd[1]: Started libpod-conmon-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope.
Dec  6 04:39:40 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.349778403 +0000 UTC m=+0.396043183 container init aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.358050648 +0000 UTC m=+0.404315388 container start aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid)
Dec  6 04:39:40 np0005548916 silly_mendeleev[78455]: 167 167
Dec  6 04:39:40 np0005548916 systemd[1]: libpod-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope: Deactivated successfully.
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.399458007 +0000 UTC m=+0.445722747 container attach aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.400765942 +0000 UTC m=+0.447030682 container died aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec  6 04:39:40 np0005548916 systemd[1]: var-lib-containers-storage-overlay-6111ee47524b5b6dc41ae8018fa81ea753c7e0a5fff66832be6e2713d62aaa7b-merged.mount: Deactivated successfully.
Dec  6 04:39:40 np0005548916 podman[78438]: 2025-12-06 09:39:40.662615585 +0000 UTC m=+0.708880365 container remove aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:40 np0005548916 systemd[1]: libpod-conmon-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope: Deactivated successfully.
Dec  6 04:39:40 np0005548916 podman[78480]: 2025-12-06 09:39:40.902977566 +0000 UTC m=+0.083243142 container create 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  6 04:39:40 np0005548916 podman[78480]: 2025-12-06 09:39:40.864671105 +0000 UTC m=+0.044936661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:39:41 np0005548916 systemd[1]: Started libpod-conmon-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope.
Dec  6 04:39:41 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:39:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:39:41 np0005548916 podman[78480]: 2025-12-06 09:39:41.355103873 +0000 UTC m=+0.535369429 container init 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:41 np0005548916 podman[78480]: 2025-12-06 09:39:41.362583251 +0000 UTC m=+0.542848787 container start 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:39:41 np0005548916 podman[78480]: 2025-12-06 09:39:41.439760033 +0000 UTC m=+0.620025569 container attach 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]: [
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:    {
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "available": false,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "being_replaced": false,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "ceph_device_lvm": false,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "lsm_data": {},
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "lvs": [],
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "path": "/dev/sr0",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "rejected_reasons": [
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "Has a FileSystem",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "Insufficient space (<5GB)"
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        ],
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        "sys_api": {
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "actuators": null,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "device_nodes": [
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:                "sr0"
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            ],
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "devname": "sr0",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "human_readable_size": "482.00 KB",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "id_bus": "ata",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "model": "QEMU DVD-ROM",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "nr_requests": "2",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "parent": "/dev/sr0",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "partitions": {},
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "path": "/dev/sr0",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "removable": "1",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "rev": "2.5+",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "ro": "0",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "rotational": "1",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "sas_address": "",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "sas_device_handle": "",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "scheduler_mode": "mq-deadline",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "sectors": 0,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "sectorsize": "2048",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "size": 493568.0,
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "support_discard": "2048",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "type": "disk",
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:            "vendor": "QEMU"
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:        }
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]:    }
Dec  6 04:39:42 np0005548916 kind_rosalind[78496]: ]
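The kind_rosalind output above is a ceph-volume style JSON inventory: /dev/sr0 is unusable for an OSD, rejected for having a filesystem and for being under 5 GB. A minimal sketch that summarizes such a listing, assuming it was captured to inventory.json (hypothetical file):

    import json

    # Summarize a device inventory like the JSON printed above:
    # path, size, and availability or the rejection reasons.
    with open("inventory.json") as f:      # hypothetical capture
        devices = json.load(f)
    for dev in devices:
        if dev["available"]:
            status = "available"
        else:
            status = "rejected: " + ", ".join(dev["rejected_reasons"])
        print(dev["path"], dev["sys_api"]["human_readable_size"], status)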
Dec  6 04:39:42 np0005548916 systemd[1]: libpod-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope: Deactivated successfully.
Dec  6 04:39:42 np0005548916 podman[78480]: 2025-12-06 09:39:42.099678628 +0000 UTC m=+1.279944194 container died 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:39:44 np0005548916 systemd[1]: var-lib-containers-storage-overlay-4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6-merged.mount: Deactivated successfully.
Dec  6 04:39:44 np0005548916 podman[78480]: 2025-12-06 09:39:44.89426193 +0000 UTC m=+4.074527466 container remove 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  6 04:39:44 np0005548916 systemd[1]: libpod-conmon-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope: Deactivated successfully.
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 22.135 iops: 5666.545 elapsed_sec: 0.529
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [WRN] : OSD bench result of 5666.545158 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
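The bench figures above are internally consistent: 12288000 bytes in 4 KiB blocks is 3000 IOs, and 3000 IOs at the reported 5666.545 IOPS yields the logged 0.529 s elapsed and 22.135 MiB/s bandwidth. Since 5666 IOPS falls outside the 50-500 sanity window, the OSD keeps its 315 IOPS mclock capacity, exactly as the warning states. A minimal cross-check:

    # Cross-check of the osd bench numbers logged above.
    count, bsize = 12288000, 4096          # bench byte count / block size
    ios = count // bsize                   # 3000 IOs
    iops = 5666.545
    print(f"elapsed   ~ {ios / iops:.3f} s")                # ~0.529 s
    print(f"bandwidth ~ {iops * bsize / 2**20:.3f} MiB/s")  # ~22.135 MiB/s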
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 0 waiting for initial osdmap
Dec  6 04:39:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:47.884+0000 7fe7916a4640 -1 osd.0 0 waiting for initial osdmap
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 check_osdmap_features require_osd_release unknown -> squid
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 set_numa_affinity not setting numa affinity
Dec  6 04:39:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:47.913+0000 7fe78cccc640 -1 osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  6 04:39:47 np0005548916 ceph-osd[77465]: osd.0 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec  6 04:39:48 np0005548916 ceph-osd[77465]: osd.0 12 state: booting -> active
Dec  6 04:39:52 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 15 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:39:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 16 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:39:54 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 17 pg[4.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:39:58 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 18 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:01 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 19 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:01 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 20 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:04 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 21 pg[6.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 23 pg[6.0( empty local-lis/les=21/23 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 24 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24 pruub=11.071418762s) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active pruub 41.233177185s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 24 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24 pruub=11.071418762s) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown pruub 41.233177185s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.18( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.19( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.17( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.16( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.15( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.14( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.13( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.12( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.11( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.10( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.f( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.e( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.d( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.c( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.b( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.a( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.7( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.6( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25 pruub=14.984453201s) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active pruub 46.179439545s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.5( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.2( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.3( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.4( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.8( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1a( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.9( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1b( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1c( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1d( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1e( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.18( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1f( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25 pruub=14.984453201s) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown pruub 46.179439545s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.19( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.17( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.12( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.7( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.2( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.6( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.4( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1f( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1e( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.10( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.11( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.12( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.13( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.15( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.14( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.16( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.17( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.8( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.9( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.b( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.a( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.d( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.7( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.2( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.6( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.5( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.c( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.4( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.3( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.f( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.e( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1d( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1c( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1b( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1a( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.19( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.18( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.10( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.11( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.12( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.16( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.17( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.0( empty local-lis/les=25/26 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.7( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.4( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:09 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec  6 04:40:09 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec  6 04:40:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 27 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27 pruub=15.629227638s) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active pruub 49.513397217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 27 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27 pruub=15.629227638s) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown pruub 49.513397217s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:10 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec  6 04:40:10 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.004294742 +0000 UTC m=+0.062250882 container create f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:11 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec  6 04:40:11 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec  6 04:40:11 np0005548916 systemd[1]: Started libpod-conmon-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope.
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:10.972917471 +0000 UTC m=+0.030873631 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:11 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.104773265 +0000 UTC m=+0.162729425 container init f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.114458584 +0000 UTC m=+0.172414694 container start f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.118549223 +0000 UTC m=+0.176505333 container attach f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  6 04:40:11 np0005548916 confident_bhaskara[79556]: 167 167
Dec  6 04:40:11 np0005548916 systemd[1]: libpod-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope: Deactivated successfully.
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.123368377 +0000 UTC m=+0.181324497 container died f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid)
Dec  6 04:40:11 np0005548916 systemd[1]: var-lib-containers-storage-overlay-d3e3b58292f6affe50334b4deb300aee298730cf5f38e1a1c0b91f0bc74f7a3a-merged.mount: Deactivated successfully.
Dec  6 04:40:11 np0005548916 podman[79539]: 2025-12-06 09:40:11.173087274 +0000 UTC m=+0.231043384 container remove f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:11 np0005548916 systemd[1]: libpod-conmon-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope: Deactivated successfully.
Dec  6 04:40:11 np0005548916 podman[79572]: 2025-12-06 09:40:11.244641234 +0000 UTC m=+0.048435882 container create 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True)
Dec  6 04:40:11 np0005548916 systemd[1]: Started libpod-conmon-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope.
Dec  6 04:40:11 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:40:11 np0005548916 podman[79572]: 2025-12-06 09:40:11.22384255 +0000 UTC m=+0.027637228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:11 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:11 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:11 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:11 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:11 np0005548916 podman[79572]: 2025-12-06 09:40:11.338856626 +0000 UTC m=+0.142651274 container init 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec  6 04:40:11 np0005548916 podman[79572]: 2025-12-06 09:40:11.345698909 +0000 UTC m=+0.149493557 container start 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  6 04:40:11 np0005548916 podman[79572]: 2025-12-06 09:40:11.349573595 +0000 UTC m=+0.153368273 container attach 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec  6 04:40:11 np0005548916 systemd[1]: libpod-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope: Deactivated successfully.
Dec  6 04:40:11 np0005548916 conmon[79588]: conmon 15545a7f3b9bbdb79c68 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope/container/memory.events
Dec  6 04:40:11 np0005548916 podman[79614]: 2025-12-06 09:40:11.476486912 +0000 UTC m=+0.025013677 container died 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Dec  6 04:40:11 np0005548916 systemd[1]: var-lib-containers-storage-overlay-5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162-merged.mount: Deactivated successfully.
Dec  6 04:40:11 np0005548916 podman[79614]: 2025-12-06 09:40:11.510424552 +0000 UTC m=+0.058951307 container remove 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:40:11 np0005548916 systemd[1]: libpod-conmon-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope: Deactivated successfully.
Dec  6 04:40:11 np0005548916 systemd[1]: Reloading.
Dec  6 04:40:11 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:11 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:11 np0005548916 systemd[1]: Reloading.
Dec  6 04:40:12 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:12 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:12 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec  6 04:40:12 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec  6 04:40:12 np0005548916 systemd[1]: Starting Ceph mon.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:12 np0005548916 podman[79751]: 2025-12-06 09:40:12.489001741 +0000 UTC m=+0.055319697 container create d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:12 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:12 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:12 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:12 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:12 np0005548916 podman[79751]: 2025-12-06 09:40:12.552978005 +0000 UTC m=+0.119295981 container init d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:40:12 np0005548916 podman[79751]: 2025-12-06 09:40:12.465356931 +0000 UTC m=+0.031674937 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:12 np0005548916 podman[79751]: 2025-12-06 09:40:12.562031941 +0000 UTC m=+0.128349897 container start d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:12 np0005548916 bash[79751]: d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad
Dec  6 04:40:12 np0005548916 systemd[1]: Started Ceph mon.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: pidfile_write: ignore empty --pid-file
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: load: jerasure load: lrc 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Git sha 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: DB SUMMARY
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: DB Session ID:  1TK25AVRA1WQDS4JHM8T
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                                     Options.env: 0x55fbbd682c20
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                                Options.info_log: 0x55fbbecdba20
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                                 Options.wal_dir: 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                    Options.write_buffer_manager: 0x55fbbecdf900
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                               Options.row_cache: None
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                              Options.wal_filter: None
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.wal_compression: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.max_background_jobs: 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.max_total_wal_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:       Options.compaction_readahead_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.max_background_flushes: -1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Compression algorithms supported:
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kZSTD supported: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kXpressCompression supported: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kBZip2Compression supported: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kLZ4Compression supported: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kZlibCompression supported: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kLZ4HCCompression supported: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         kSnappyCompression supported: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:           Options.merge_operator: 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:        Options.compaction_filter: None
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fbbecda5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fbbecff350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:        Options.write_buffer_size: 33554432
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:  Options.max_write_buffer_number: 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.compression: NoCompression
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.num_levels: 7
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012604679, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012606628, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012606736, "job": 1, "event": "recovery_finished"}
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fbbed00e00
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: DB pointer 0x55fbbee0a000
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 0.0 total, 0.0 interval
Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.0 total, 0.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(???) e0 preinit fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).mds e1 new map
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).mds e1 print_map
e1
btime 2025-12-06T09:37:41:285728+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: -1

No filesystems configured
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 3314933000852226048, adjusting msgr requires
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2735601092' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:12 np0005548916 ceph-mon[79770]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec  6 04:40:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec  6 04:40:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec  6 04:40:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec  6 04:40:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec  6 04:40:14 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts
Dec  6 04:40:14 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.11 deep-scrub ok
Dec  6 04:40:15 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec  6 04:40:15 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec  6 04:40:16 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec  6 04:40:16 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec  6 04:40:17 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec  6 04:40:17 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec  6 04:40:18 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec  6 04:40:18 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1f( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.13( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1e( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.10( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.12( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.15( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.14( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.17( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.11( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.16( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.9( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.8( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.b( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.a( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.d( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.c( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.6( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.3( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.7( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.4( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.5( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.2( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.e( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.f( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1c( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1d( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1a( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1b( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.19( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.18( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.17( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.14( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.6( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.3( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.5( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.19( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.d scrub starts
Dec  6 04:40:19 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.d scrub ok
Dec  6 04:40:20 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.b deep-scrub starts
Dec  6 04:40:20 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.b deep-scrub ok
Dec  6 04:40:22 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.7 deep-scrub starts
Dec  6 04:40:22 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.7 deep-scrub ok
Dec  6 04:40:22 np0005548916 ceph-mon[79770]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Dec  6 04:40:22 np0005548916 ceph-mon[79770]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Dec  6 04:40:22 np0005548916 ceph-mon[79770]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Dec  6 04:40:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:23 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.a deep-scrub starts
Dec  6 04:40:23 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.a deep-scrub ok
Dec  6 04:40:24 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Dec  6 04:40:24 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Dec  6 04:40:25 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec  6 04:40:25 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec  6 04:40:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec  6 04:40:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec  6 04:40:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e28 e28: 2 total, 2 up, 2 in
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: Deploying daemon mon.compute-1 on compute-1
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-0 calling monitor election
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-2 calling monitor election
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'images'
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
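[editor's note] The health detail above names the remedy itself. A minimal sketch of clearing the POOL_APP_NOT_ENABLED warning with that command, assuming an admin keyring is available on the node; the pool names come verbatim from the warning, and pairing the cephfs.* pools with the 'cephfs' application is an assumption based on their names (the audit lines that follow show client.admin issuing the 'images' variant):

# pools flagged in the health detail; app names per the hint above
ceph osd pool application enable images rbd
ceph osd pool application enable cephfs.cephfs.meta cephfs   # assumed: metadata pool of the cephfs filesystem
ceph osd pool application enable cephfs.cephfs.data cephfs   # assumed: data pool of the cephfs filesystem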
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,os=Linux}
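
The mgrc line above is mon.compute-1 pushing its host and container metadata to the mgr. The same record can be read back later with the standard CLI (illustrative, not a command from this log):

    ceph mon metadata compute-1    # dump the stored metadata for this monitor
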
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e29 e29: 2 total, 2 up, 2 in
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-0 calling monitor election
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-2 calling monitor election
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.702283859s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330039978s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.702230453s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330039978s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-mon[79770]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'images'
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.698268890s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.326400757s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578024864s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206153870s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.698243141s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.326400757s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577967644s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206153870s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572353363s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200588226s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572320938s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200588226s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572243690s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200527191s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572222710s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200527191s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701685905s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330024719s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572280884s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200660706s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701644897s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330024719s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701715469s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330139160s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572256088s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200660706s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701698303s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330139160s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-mon[79770]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572258949s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200714111s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578099251s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206558228s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572238922s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200714111s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578079224s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206558228s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572110176s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200721741s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578001022s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206657410s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572064400s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200721741s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701502800s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330177307s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577963829s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206657410s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572211266s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200912476s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701452255s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330184937s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572188377s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200912476s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701453209s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330177307s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701428413s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330184937s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572074890s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200874329s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577779770s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206588745s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572056770s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200874329s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577763557s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206588745s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577795029s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206672668s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571990013s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200889587s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577779770s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206672668s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571936607s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200893402s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571940422s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200889587s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577631950s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206611633s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571908951s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200893402s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577610016s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206611633s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571833611s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200904846s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577974319s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.207054138s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577961922s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.207054138s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571818352s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200904846s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571742058s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200965881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577619553s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206855774s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577604294s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206855774s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571721077s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200965881s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701019287s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330410004s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577396393s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206832886s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700989723s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330410004s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578357697s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.207824707s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577382088s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206832886s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578341484s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.207824707s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700942039s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330444336s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700921059s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330444336s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700901985s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330451965s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577339172s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206924438s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700880051s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330451965s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577327728s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206924438s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578140259s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.207798004s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578126907s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.207798004s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700796127s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330554962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700776100s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330554962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700767517s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330554962s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700776100s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330581665s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700756073s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330554962s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700746536s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330581665s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577299118s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.207168579s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578253746s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208145142s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577277184s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.207168579s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578232765s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700592995s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330585480s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700614929s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330642700s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700572968s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330585480s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578066826s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208156586s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700556755s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330642700s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579343796s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209487915s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578047752s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208156586s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579326630s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209487915s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.577960014s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208160400s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700434685s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330650330s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579102516s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700411797s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330650330s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579081535s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209335327s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.577885628s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208160400s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578135490s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208438873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578117371s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208438873s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700207710s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330684662s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579350471s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209854126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579333305s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209854126s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700164795s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330684662s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.19( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.15( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.13( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.10( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.e( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.4( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.6( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.9( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.a( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1f( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1e( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:40:26 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
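
The burst of PeeringState messages above is osd.0 reacting to osdmap e29: for pools 3-5 the acting set moves from [0] to [1], so osd.0 drops to role -1 and those PGs go Stray, while the pool 2 PGs it still heads restart peering as Primary. A hedged way to watch the same transition settle from an admin node (standard ceph CLI, not taken from this log):

    ceph pg dump pgs_brief    # one line per PG: state plus up/acting sets
    ceph pg 3.16 query        # detailed peering history for one PG named above
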
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.404950584 +0000 UTC m=+0.043297892 container create aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec  6 04:40:27 np0005548916 systemd[1]: Started libpod-conmon-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope.
Dec  6 04:40:27 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.386575567 +0000 UTC m=+0.024922895 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.489960257 +0000 UTC m=+0.128307635 container init aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.497676778 +0000 UTC m=+0.136024096 container start aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.502180245 +0000 UTC m=+0.140527603 container attach aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Dec  6 04:40:27 np0005548916 ecstatic_hermann[79916]: 167 167
Dec  6 04:40:27 np0005548916 systemd[1]: libpod-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope: Deactivated successfully.
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.504448199 +0000 UTC m=+0.142795517 container died aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Dec  6 04:40:27 np0005548916 systemd[1]: var-lib-containers-storage-overlay-919f56925ff5c04dc21b36e230b03c3c9513857396e97c0edb2448e1fd51ab3d-merged.mount: Deactivated successfully.
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1019933393 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:27 np0005548916 podman[79899]: 2025-12-06 09:40:27.848581931 +0000 UTC m=+0.486929269 container remove aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
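
The short-lived container ecstatic_hermann above (create through remove in under a second, output "167 167") is consistent with cephadm's usual probe of the ceph uid/gid baked into the image; assuming that is what ran, the probe can be reproduced by hand (hypothetical reconstruction, not a command from this log):

    podman run --rm quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        stat -c '%u %g' /var/lib/ceph    # prints "167 167" on Ceph images
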
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 e30: 2 total, 2 up, 2 in
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: mon.compute-1 calling monitor election
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  6 04:40:27 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  6 04:40:27 np0005548916 systemd[1]: libpod-conmon-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope: Deactivated successfully.
Dec  6 04:40:28 np0005548916 systemd[1]: Reloading.
Dec  6 04:40:28 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:28 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:28 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec  6 04:40:28 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec  6 04:40:28 np0005548916 systemd[1]: Reloading.
Dec  6 04:40:28 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:28 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:28 np0005548916 systemd[1]: Starting Ceph mgr.compute-1.sauzid for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: Deploying daemon mgr.compute-1.sauzid on compute-1
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  6 04:40:28 np0005548916 podman[80060]: 2025-12-06 09:40:28.922782339 +0000 UTC m=+0.062874754 container create 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:28 np0005548916 podman[80060]: 2025-12-06 09:40:28.889872609 +0000 UTC m=+0.029965114 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:28 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:28 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:28 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:28 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/lib/ceph/mgr/ceph-compute-1.sauzid supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:29 np0005548916 podman[80060]: 2025-12-06 09:40:29.007606718 +0000 UTC m=+0.147699153 container init 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:40:29 np0005548916 podman[80060]: 2025-12-06 09:40:29.017081472 +0000 UTC m=+0.157173887 container start 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:29 np0005548916 bash[80060]: 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9
Dec  6 04:40:29 np0005548916 systemd[1]: Started Ceph mgr.compute-1.sauzid for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:29 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec  6 04:40:29 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec  6 04:40:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:29.230+0000 7fa301e39140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec  6 04:40:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:29.314+0000 7fa301e39140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec  6 04:40:30 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec  6 04:40:30 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec  6 04:40:30 np0005548916 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec  6 04:40:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:30.142+0000 7fa301e39140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.008+0000 7fa301e39140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.13 deep-scrub starts
Dec  6 04:40:31 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.13 deep-scrub ok
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:  from numpy import show_config as show_numpy_config
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.510+0000 7fa301e39140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.589+0000 7fa301e39140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:40:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.765+0000 7fa301e39140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:40:32 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec  6 04:40:32 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec  6 04:40:32 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec  6 04:40:32 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:40:32 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec  6 04:40:32 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec  6 04:40:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020053179 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.159+0000 7fa301e39140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec  6 04:40:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.420+0000 7fa301e39140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.599+0000 7fa301e39140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.691+0000 7fa301e39140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.803+0000 7fa301e39140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:40:33 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec  6 04:40:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.893+0000 7fa301e39140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:40:34 np0005548916 ceph-mon[79770]: Health check cleared: POOL_APP_NOT_ENABLED (was: 2 pool(s) do not have an application enabled)
Dec  6 04:40:34 np0005548916 ceph-mon[79770]: Cluster is now healthy
Dec  6 04:40:34 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec  6 04:40:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec  6 04:40:34 np0005548916 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:40:34 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:40:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:34.329+0000 7fa301e39140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:40:34 np0005548916 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:40:34 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec  6 04:40:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:34.546+0000 7fa301e39140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:40:34 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec  6 04:40:35 np0005548916 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:40:35 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec  6 04:40:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:35.093+0000 7fa301e39140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:40:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec  6 04:40:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec  6 04:40:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec  6 04:40:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec  6 04:40:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.294+0000 7fa301e39140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:40:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.386+0000 7fa301e39140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec  6 04:40:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.489+0000 7fa301e39140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec  6 04:40:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.882+0000 7fa301e39140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:40:36 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec  6 04:40:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.972+0000 7fa301e39140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:40:37 np0005548916 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:40:37 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:40:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:37.181+0000 7fa301e39140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:40:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec  6 04:40:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec  6 04:40:37 np0005548916 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:37 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec  6 04:40:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:37.457+0000 7fa301e39140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054712 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:38 np0005548916 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:40:38 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec  6 04:40:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:38.079+0000 7fa301e39140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:40:38 np0005548916 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:40:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:38.175+0000 7fa301e39140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:40:38 np0005548916 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x56071dbacd00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  6 04:40:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts
Dec  6 04:40:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  6 04:40:39 np0005548916 ceph-mon[79770]: Deploying daemon crash.compute-2 on compute-2
Dec  6 04:40:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec  6 04:40:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec  6 04:40:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec  6 04:40:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec  6 04:40:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec  6 04:40:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec  6 04:40:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec  6 04:40:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec  6 04:40:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec  6 04:40:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec  6 04:40:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Dec  6 04:40:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Dec  6 04:40:45 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec  6 04:40:45 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec  6 04:40:45 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940510154' entity='client.admin' 
Dec  6 04:40:45 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:45 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:45 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec  6 04:40:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec  6 04:40:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec  6 04:40:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec  6 04:40:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec  6 04:40:48 np0005548916 systemd[72560]: Starting Mark boot as successful...
Dec  6 04:40:48 np0005548916 systemd[72560]: Finished Mark boot as successful.
Dec  6 04:40:48 np0005548916 ceph-mon[79770]: Saving service ingress.rgw.default spec with placement count:2
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Dec  6 04:40:49 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec  6 04:40:49 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: Saving service node-exporter spec with placement *
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: Saving service grafana spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: Saving service prometheus spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: Saving service alertmanager spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/569971095' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec  6 04:40:49 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]': finished
Dec  6 04:40:50 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.3 deep-scrub starts
Dec  6 04:40:50 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.3 deep-scrub ok
Dec  6 04:40:51 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec  6 04:40:51 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec  6 04:40:51 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/4267326554' entity='client.admin' 
Dec  6 04:40:52 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Dec  6 04:40:52 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Dec  6 04:40:52 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:52 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:52 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/821839877' entity='client.admin' 
Dec  6 04:40:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:53 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec  6 04:40:53 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec  6 04:40:54 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec  6 04:40:54 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec  6 04:40:55 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec  6 04:40:55 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec  6 04:40:55 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1482144347' entity='client.admin' 
Dec  6 04:40:55 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  6 04:40:56 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec  6 04:40:56 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec  6 04:40:56 np0005548916 python3[80138]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:40:56 np0005548916 ceph-mon[79770]: Deploying daemon osd.2 on compute-2
Dec  6 04:40:56 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/3512142115' entity='client.admin' 
Dec  6 04:40:57 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec  6 04:40:57 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec  6 04:40:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:58 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec  6 04:40:58 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec  6 04:40:58 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2451230512' entity='client.admin' 
Dec  6 04:40:59 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec  6 04:40:59 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec  6 04:40:59 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:59 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2111286861' entity='client.admin' 
Dec  6 04:40:59 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:00 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec  6 04:41:00 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec  6 04:41:00 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  6 04:41:01 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec  6 04:41:01 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec  6 04:41:01 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  6 04:41:01 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:01 np0005548916 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:02 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec  6 04:41:02 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec  6 04:41:02 np0005548916 podman[80301]: 2025-12-06 09:41:02.700428456 +0000 UTC m=+0.088899157 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:41:02 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:41:02 np0005548916 podman[80301]: 2025-12-06 09:41:02.80675084 +0000 UTC m=+0.195221571 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec  6 04:41:02 np0005548916 systemd[1]: session-22.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 22 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd[1]: session-28.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 22.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 28.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:02 np0005548916 systemd[1]: session-30.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd[1]: session-26.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 30.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 26.
Dec  6 04:41:02 np0005548916 systemd[1]: session-23.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd[1]: session-25.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd[1]: session-20.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd[1]: session-27.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd[1]: session-31.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd[1]: session-24.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd[1]: session-29.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Session 29 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 23.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 25.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 20.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 27.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 31.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 24.
Dec  6 04:41:02 np0005548916 systemd-logind[788]: Removed session 29.
Dec  6 04:41:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec  6 04:41:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:02 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:03 np0005548916 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:03.032+0000 7f880081b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548916 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:03.117+0000 7f880081b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:03 np0005548916 systemd[1]: session-32.scope: Deactivated successfully.
Dec  6 04:41:03 np0005548916 systemd[1]: session-32.scope: Consumed 1min 26.086s CPU time.
Dec  6 04:41:03 np0005548916 systemd-logind[788]: Removed session 32.
Dec  6 04:41:03 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.19 deep-scrub starts
Dec  6 04:41:03 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.19 deep-scrub ok
Dec  6 04:41:03 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  6 04:41:03 np0005548916 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  6 04:41:03 np0005548916 ceph-mon[79770]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  6 04:41:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e32 e32: 3 total, 2 up, 3 in
Dec  6 04:41:03 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.013+0000 7f880081b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec  6 04:41:04 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.731+0000 7f880081b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e33 e33: 3 total, 2 up, 3 in
Dec  6 04:41:04 np0005548916 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  6 04:41:04 np0005548916 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  6 04:41:04 np0005548916 ceph-mon[79770]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.919+0000 7f880081b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.996+0000 7f880081b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176478386s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331306458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176478386s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.811074257s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966171265s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.811074257s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176123619s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331329346s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176123619s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.052652359s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.207954407s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.052652359s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810589790s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965995789s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810589790s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810255051s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966018677s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810255051s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175810814s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331695557s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810143471s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966064453s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810143471s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175810814s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175298691s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331344604s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175298691s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809710503s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965881348s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809786797s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965988159s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809786797s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809710503s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175025940s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331336975s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175025940s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.045259476s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.201667786s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.045259476s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175087929s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331726074s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175087929s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051401138s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208145142s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051401138s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051333427s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208114624s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051333427s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051722527s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208694458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051722527s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051517487s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.208557129s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051517487s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051094055s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208251953s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053232193s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.210411072s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051094055s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053232193s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051561356s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.208839417s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051561356s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053050041s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.210380554s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053050041s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.808290482s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965744019s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.808290482s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:05.132+0000 7f880081b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1e deep-scrub starts
Dec  6 04:41:05 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1e deep-scrub ok
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:05 np0005548916 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
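The crush create-or-move command above carries a weight of 0.0195. CRUSH weights are conventionally the device size in TiB, so this would correspond to a disk of roughly 20 GiB; a quick sanity check of that arithmetic is sketched below (the size-in-TiB convention is an assumption here, not stated in the log).

    # Sketch: convert a CRUSH weight (size in TiB by convention) back to GiB.
    # The 0.0195 figure is taken from the log line above; that cephadm sets
    # weight = device size in TiB is an assumption for illustration.
    weight_tib = 0.0195
    size_bytes = weight_tib * 2**40          # 1 TiB = 2**40 bytes
    print(f"{size_bytes / 2**30:.1f} GiB")   # -> ~20.0 GiB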
Dec  6 04:41:05 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:06 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.271+0000 7f880081b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:06 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.505+0000 7f880081b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.589+0000 7f880081b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.662+0000 7f880081b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.753+0000 7f880081b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.850+0000 7f880081b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec  6 04:41:07 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.264+0000 7f880081b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.361+0000 7f880081b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.853+0000 7f880081b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec  6 04:41:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec  6 04:41:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.517+0000 7f880081b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.597+0000 7f880081b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.697+0000 7f880081b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.872+0000 7f880081b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.971+0000 7f880081b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e34 e34: 3 total, 2 up, 3 in
Dec  6 04:41:09 np0005548916 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:09.343+0000 7f880081b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:09 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:09 np0005548916 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:09.677+0000 7f880081b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:09 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:09 np0005548916 systemd-logind[788]: New session 33 of user ceph-admin.
Dec  6 04:41:09 np0005548916 systemd[1]: Started Session 33 of User ceph-admin.
Dec  6 04:41:10 np0005548916 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:10 np0005548916 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:10 np0005548916 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:41:10 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:10.056+0000 7f880081b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:10.176+0000 7f880081b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
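The recurring "missing NOTIFY_TYPES member" warnings above come from the mgr's python module loader, which expects each module class to declare which cluster notifications it consumes; modules that omit the attribute still load, but the loader logs this warning for each one. A minimal sketch of a module declaring NOTIFY_TYPES, assuming Ceph's in-tree mgr_module API (the module name here is hypothetical, and the code only runs inside the ceph-mgr embedded interpreter):

    # Minimal ceph-mgr module sketch declaring NOTIFY_TYPES (hypothetical module).
    # Modules without this class attribute trigger the "missing NOTIFY_TYPES
    # member" warning seen throughout this log.
    from mgr_module import MgrModule, NotifyType

    class Example(MgrModule):
        # Subscribe only to the notifications this module actually handles.
        NOTIFY_TYPES = [NotifyType.osd_map, NotifyType.pg_summary]

        def notify(self, notify_type: NotifyType, notify_id: str) -> None:
            # Called by the mgr once per subscribed notification.
            self.log.debug("got %s notification", notify_type)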
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x55cc3d915860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  6 04:41:10 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec  6 04:41:11 np0005548916 podman[80554]: 2025-12-06 09:41:11.109317284 +0000 UTC m=+0.090581227 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:11 np0005548916 podman[80554]: 2025-12-06 09:41:11.242774691 +0000 UTC m=+0.224038594 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec  6 04:41:11 np0005548916 ceph-mon[79770]: OSD bench result of 3012.211775 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
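The mclock message above explicitly recommends benchmarking the device externally (e.g. with fio) and then overriding osd_mclock_max_capacity_iops_[hdd|ssd]. A hedged sketch of that override, driven from Python via the ceph CLI; the 3000 IOPS figure is illustrative only, loosely rounded from the bench result in the log, and should come from your own measurement:

    # Sketch: override the mclock IOPS capacity for osd.2 after benchmarking.
    # Measure first with fio, e.g.
    #   fio --name=bench --filename=/dev/sdX --rw=randwrite --bs=4k \
    #       --iodepth=32 --runtime=60 --time_based --direct=1
    # (destructive on a raw device -- run only against an unused disk).
    import subprocess

    subprocess.run(
        ["ceph", "config", "set", "osd.2",
         "osd_mclock_max_capacity_iops_ssd", "3000"],  # illustrative value
        check=True,
    )

Whether the _hdd or _ssd variant applies depends on how the OSD's device class was detected; check with `ceph osd metadata osd.2` before overriding.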
Dec  6 04:41:11 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997880459s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.632750034s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997904301s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997353077s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.632209778s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997376680s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996918201s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996894836s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631101608s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631021500s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631088734s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630993366s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996212721s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996188402s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630615711s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630573750s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630668163s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995894432s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995878220s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630395412s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630543232s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.865968227s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630350113s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.865921497s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995961189s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995939732s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.872111320s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.872088432s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.872457027s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871963501s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.872427464s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871884346s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.629342079s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.629287243s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871744156s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871904373s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871889114s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873806000s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871715546s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873502731s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873318672s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873299599s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.871504784s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.871484756s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.870471954s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:41:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.870453835s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Bus STARTING
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Client ('192.168.122.100', 54474) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040] boot
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Bus STARTED
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Dec  6 04:41:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:13 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:14 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec  6 04:41:14 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Dec  6 04:41:14 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
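The three rejections above are pure arithmetic: the autotuner computes targets of about 134217728 bytes (128 MiB), but osd_memory_target enforces a minimum of 939524096 bytes (896 MiB), so every value below that floor fails to parse. A small sketch that pre-checks the floor before calling the ceph CLI (the helper name is hypothetical; the floor constant is taken from the error message itself):

    # Sketch: guard osd_memory_target updates against the enforced minimum.
    # 134217728 B = 128 MiB < 939524096 B = 896 MiB, hence the errors above.
    import subprocess

    MIN_OSD_MEMORY_TARGET = 939_524_096  # 896 MiB, from the log's error text

    def set_osd_memory_target(osd: str, target_bytes: int) -> None:
        if target_bytes < MIN_OSD_MEMORY_TARGET:
            raise ValueError(
                f"{target_bytes} is below minimum {MIN_OSD_MEMORY_TARGET}")
        subprocess.run(
            ["ceph", "config", "set", osd,
             "osd_memory_target", str(target_bytes)],
            check=True,
        )

    set_osd_memory_target("osd.2", 939_524_096)  # smallest accepted value

On hosts this small (each OSD being offered ~128 MiB), the practical fix is to disable or re-tune the autotuner rather than lower the floor, since the minimum exists to keep BlueStore caches functional.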
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548916 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:41:16 np0005548916 systemd[1]: session-33.scope: Deactivated successfully.
Dec  6 04:41:16 np0005548916 systemd[1]: session-33.scope: Consumed 5.190s CPU time.
Dec  6 04:41:16 np0005548916 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Dec  6 04:41:16 np0005548916 systemd-logind[788]: Removed session 33.
Dec  6 04:41:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec  6 04:41:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:16.710+0000 7fecc40b7140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:16.789+0000 7fecc40b7140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:17 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  6 04:41:17 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  6 04:41:17 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec  6 04:41:17 np0005548916 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:17.713+0000 7fecc40b7140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:17 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:18 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.413+0000 7fecc40b7140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.600+0000 7fecc40b7140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.676+0000 7fecc40b7140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.866+0000 7fecc40b7140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.006+0000 7fecc40b7140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.248+0000 7fecc40b7140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.334+0000 7fecc40b7140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.404+0000 7fecc40b7140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.485+0000 7fecc40b7140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.570+0000 7fecc40b7140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.927+0000 7fecc40b7140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:21.031+0000 7fecc40b7140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548916 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec  6 04:41:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:21.500+0000 7fecc40b7140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548916 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.130+0000 7fecc40b7140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.202+0000 7fecc40b7140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.297+0000 7fecc40b7140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.469+0000 7fecc40b7140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.546+0000 7fecc40b7140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.703+0000 7fecc40b7140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.949+0000 7fecc40b7140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.241+0000 7fecc40b7140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.314+0000 7fecc40b7140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
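Every module in this load pass logs the same complaint: the mgr checks each module class for a NOTIFY_TYPES declaration and warns when it is absent. A minimal sketch of what that declaration looks like, assuming the MgrModule/NotifyType interface from mgr_module.py in this release; the module body and handler are illustrative, not any bundled module's actual code:

    # Minimal sketch, assuming the mgr_module.py interface; the module
    # name and the handler body are hypothetical.
    from mgr_module import MgrModule, NotifyType

    class Module(MgrModule):
        # This attribute is what the "missing NOTIFY_TYPES member"
        # warning above is checking for: the notification types the
        # module wants delivered to notify().
        NOTIFY_TYPES = [NotifyType.mon_map, NotifyType.osd_map]

        def notify(self, notify_type, notify_id):
            # The mgr calls this for each subscribed notification.
            if notify_type == NotifyType.osd_map:
                self.log.debug("OSDMap changed")

The warnings are noise rather than failures: each module still loads, as the subsequent "Loading python module" lines show.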
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x55b29e54d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
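The dump above is the mgr re-executing itself because handle_mgr_map saw the set of enabled modules change. It preserves its original argv (entries 0 through 10 above) and re-executes through /proc/self/exe. A rough Python illustration of the pattern; the real path is C++ inside ceph-mgr:

    # Hedged sketch of the respawn mechanism logged above, not the
    # actual implementation.
    import os

    def respawn(saved_argv):
        # /proc/self/exe always names the running binary, even if the
        # on-disk file moved, hence the exe_path line in the log.
        os.execv("/proc/self/exe", saved_argv)

    respawn(["/usr/bin/ceph-mgr", "-n", "mgr.compute-1.sauzid", "-f",
             "--setuser", "ceph", "--setgroup", "ceph",
             "--default-log-to-file=false",
             "--default-log-to-journald=true",
             "--default-log-to-stderr=false"])

After the re-exec the new process repeats the whole module-load pass, which is why the NOTIFY_TYPES warnings appear a second time below.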
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.548+0000 7f3d65c6f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:23 np0005548916 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:23 np0005548916 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.633+0000 7f3d65c6f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:24 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec  6 04:41:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:24.591+0000 7f3d65c6f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548916 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.307+0000 7f3d65c6f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.501+0000 7f3d65c6f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.582+0000 7f3d65c6f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.736+0000 7f3d65c6f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:26.873+0000 7f3d65c6f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.114+0000 7f3d65c6f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.200+0000 7f3d65c6f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.275+0000 7f3d65c6f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.371+0000 7f3d65c6f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec  6 04:41:27 np0005548916 systemd[1]: Stopping User Manager for UID 42477...
Dec  6 04:41:27 np0005548916 systemd[72560]: Activating special unit Exit the Session...
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped target Main User Target.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped target Basic System.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped target Paths.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped target Sockets.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped target Timers.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  6 04:41:27 np0005548916 systemd[72560]: Closed D-Bus User Message Bus Socket.
Dec  6 04:41:27 np0005548916 systemd[72560]: Stopped Create User's Volatile Files and Directories.
Dec  6 04:41:27 np0005548916 systemd[72560]: Removed slice User Application Slice.
Dec  6 04:41:27 np0005548916 systemd[72560]: Reached target Shutdown.
Dec  6 04:41:27 np0005548916 systemd[72560]: Finished Exit the Session.
Dec  6 04:41:27 np0005548916 systemd[72560]: Reached target Exit the Session.
Dec  6 04:41:27 np0005548916 systemd[1]: user@42477.service: Deactivated successfully.
Dec  6 04:41:27 np0005548916 systemd[1]: Stopped User Manager for UID 42477.
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.455+0000 7f3d65c6f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:27 np0005548916 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec  6 04:41:27 np0005548916 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec  6 04:41:27 np0005548916 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec  6 04:41:27 np0005548916 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec  6 04:41:27 np0005548916 systemd[1]: Removed slice User Slice of UID 42477.
Dec  6 04:41:27 np0005548916 systemd[1]: user-42477.slice: Consumed 1min 32.940s CPU time.
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.837+0000 7f3d65c6f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.947+0000 7f3d65c6f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec  6 04:41:28 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:28.417+0000 7f3d65c6f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548916 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec  6 04:41:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.087+0000 7f3d65c6f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.166+0000 7f3d65c6f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.254+0000 7f3d65c6f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec  6 04:41:29 np0005548916 systemd[1]: Created slice User Slice of UID 42477.
Dec  6 04:41:29 np0005548916 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  6 04:41:29 np0005548916 systemd-logind[788]: New session 34 of user ceph-admin.
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.398+0000 7f3d65c6f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:29 np0005548916 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  6 04:41:29 np0005548916 systemd[1]: Starting User Manager for UID 42477...
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.471+0000 7f3d65c6f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:29 np0005548916 systemd[81504]: Queued start job for default target Main User Target.
Dec  6 04:41:29 np0005548916 systemd[81504]: Created slice User Application Slice.
Dec  6 04:41:29 np0005548916 systemd[81504]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:41:29 np0005548916 systemd[81504]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:41:29 np0005548916 systemd[81504]: Reached target Paths.
Dec  6 04:41:29 np0005548916 systemd[81504]: Reached target Timers.
Dec  6 04:41:29 np0005548916 systemd[81504]: Starting D-Bus User Message Bus Socket...
Dec  6 04:41:29 np0005548916 systemd[81504]: Starting Create User's Volatile Files and Directories...
Dec  6 04:41:29 np0005548916 systemd[81504]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:41:29 np0005548916 systemd[81504]: Finished Create User's Volatile Files and Directories.
Dec  6 04:41:29 np0005548916 systemd[81504]: Reached target Sockets.
Dec  6 04:41:29 np0005548916 systemd[81504]: Reached target Basic System.
Dec  6 04:41:29 np0005548916 systemd[81504]: Reached target Main User Target.
Dec  6 04:41:29 np0005548916 systemd[81504]: Startup finished in 133ms.
Dec  6 04:41:29 np0005548916 systemd[1]: Started User Manager for UID 42477.
Dec  6 04:41:29 np0005548916 systemd[1]: Started Session 34 of User ceph-admin.
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.627+0000 7f3d65c6f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:41:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.856+0000 7f3d65c6f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:29 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e2 new map
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e2 print_map
    e2
    btime 2025-12-06T09:41:29.967825+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name               cephfs
    epoch                 2
    flags                 12 joinable allow_snaps allow_multimds_snaps
    created               2025-12-06T09:41:29.967778+0000
    modified              2025-12-06T09:41:29.967778+0000
    tableserver           0
    root                  0
    session_timeout       60
    session_autoclose     300
    max_file_size         1099511627776
    max_xattr_size        65536
    required_client_features  {}
    last_failure          0
    last_failure_osd_epoch  0
    compat                compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds               1
    in
    up                    {}
    failed
    damaged
    stopped
    data_pools            [7]
    metadata_pool         6
    inline_data           disabled
    balancer
    bal_rank_mask         -1
    standby_count_wanted  0
    qdb_cluster           leader: 0 members:
Dec  6 04:41:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Dec  6 04:41:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:30.157+0000 7f3d65c6f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:30.242+0000 7f3d65c6f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x56550deb1860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  6 04:41:30 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec  6 04:41:30 np0005548916 podman[81652]: 2025-12-06 09:41:30.441162829 +0000 UTC m=+0.098113352 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Dec  6 04:41:30 np0005548916 podman[81652]: 2025-12-06 09:41:30.533959907 +0000 UTC m=+0.190910390 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Bus STARTING
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Client ('192.168.122.100', 59318) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Bus STARTED
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
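All three failures are the same arithmetic: cephadm's osd_memory_target autotuning divides this small VM's memory down to roughly 128 MiB per OSD, which sits far below the option's hard floor of 939524096 bytes (exactly 896 MiB), so each value is rejected. Checking the logged numbers:

    # The three logged values versus the minimum quoted in the errors.
    MIN_TARGET = 939_524_096  # bytes; 939_524_096 / 2**20 == 896 MiB

    for host, target in [("compute-2", 134_214_860),
                         ("compute-0", 134_217_728),
                         ("compute-1", 134_211_993)]:
        print(f"{host}: {target} bytes = {target / 2**20:.1f} MiB; "
              f"{'ok' if target >= MIN_TARGET else 'below the floor'}")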
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:41:32 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:41:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec  6 04:41:33 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 40 pg[8.0( empty local-lis/les=0/0 n=0 ec=40/40 lis/c=0/0 les/c/f=0/0/0 sis=40) [0] r=0 lpr=40 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec  6 04:41:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 41 pg[8.0( empty local-lis/les=40/41 n=0 ec=40/40 lis/c=0/0 les/c/f=0/0/0 sis=40) [0] r=0 lpr=40 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:34 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec  6 04:41:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548916 ceph-mon[79770]: Deploying daemon node-exporter.compute-0 on compute-0
Dec  6 04:41:37 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  6 04:41:37 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  6 04:41:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:38 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:38 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:38 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:38 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:39 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:39 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:39 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:39 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:39 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:39 np0005548916 systemd[1]: Starting Ceph node-exporter.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:41:39 np0005548916 ceph-mon[79770]: Deploying daemon node-exporter.compute-1 on compute-1
Dec  6 04:41:39 np0005548916 bash[82997]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec  6 04:41:40 np0005548916 bash[82997]: Getting image source signatures
Dec  6 04:41:40 np0005548916 bash[82997]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec  6 04:41:40 np0005548916 bash[82997]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec  6 04:41:40 np0005548916 bash[82997]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec  6 04:41:40 np0005548916 bash[82997]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec  6 04:41:40 np0005548916 bash[82997]: Writing manifest to image destination
Dec  6 04:41:40 np0005548916 podman[82997]: 2025-12-06 09:41:40.840037377 +0000 UTC m=+1.078055011 container create 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:40 np0005548916 podman[82997]: 2025-12-06 09:41:40.81834396 +0000 UTC m=+1.056361634 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec  6 04:41:40 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e7edaacd69ae45d5bad71eeb3f011b2043921644e9cc36e86eee43df0ce8ca/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:40 np0005548916 podman[82997]: 2025-12-06 09:41:40.905353682 +0000 UTC m=+1.143371376 container init 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:40 np0005548916 podman[82997]: 2025-12-06 09:41:40.910650006 +0000 UTC m=+1.148667640 container start 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:40 np0005548916 bash[82997]: 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.919Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.919Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=arp
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=bcache
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=bonding
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=cpu
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=dmi
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=edac
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=entropy
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=filefd
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=hwmon
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netclass
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netdev
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netstat
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nfs
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nvme
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=os
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=pressure
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=rapl
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=selinux
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=softnet
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=stat
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=textfile
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=time
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=uname
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=xfs
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.922Z caller=node_exporter.go:117 level=info collector=zfs
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.923Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec  6 04:41:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.923Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec  6 04:41:40 np0005548916 systemd[1]: Started Ceph node-exporter.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
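node_exporter logs every parsed exclude pattern verbatim, so the filters can be sanity-checked offline. A quick test of the mount-points exclude regex from the startup lines above; the sample paths are illustrative:

    # The --collector.filesystem.mount-points-exclude regex exactly as
    # logged above, applied to a few made-up mount points.
    import re

    exclude = re.compile(
        r"^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+"
        r"|var/lib/containers/storage/.+)($|/)")

    for mp in ["/", "/proc", "/var/lib/containers/storage/overlay",
               "/var/lib/ceph"]:
        print(mp, "->", "excluded" if exclude.search(mp) else "kept")

This is why the overlay mounts that podman creates throughout this log never appear as filesystem metrics.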
Dec  6 04:41:41 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1032166629' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  6 04:41:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:43 np0005548916 ceph-mon[79770]: Deploying daemon node-exporter.compute-2 on compute-2
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:41:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548916 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-2.qizhkr on compute-2
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.363445065 +0000 UTC m=+0.053251775 container create 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:41:51 np0005548916 systemd[1]: Started libpod-conmon-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope.
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.333032744 +0000 UTC m=+0.022839494 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:51 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.462116468 +0000 UTC m=+0.151923238 container init 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.469406279 +0000 UTC m=+0.159213059 container start 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.473772211 +0000 UTC m=+0.163578961 container attach 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec  6 04:41:51 np0005548916 cool_darwin[83188]: 167 167
Dec  6 04:41:51 np0005548916 systemd[1]: libpod-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope: Deactivated successfully.
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.479049804 +0000 UTC m=+0.168856534 container died 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:41:51 np0005548916 systemd[1]: var-lib-containers-storage-overlay-86937cdc766307aff207e76b35a8eca25302da0280b0efceb85d37c406cb5ca1-merged.mount: Deactivated successfully.
Dec  6 04:41:51 np0005548916 podman[83172]: 2025-12-06 09:41:51.539719901 +0000 UTC m=+0.229526631 container remove 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:51 np0005548916 systemd[1]: libpod-conmon-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope: Deactivated successfully.
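The cool_darwin container above (create, init, start, attach, die, remove, all within the same second) is a short-lived helper run just before the rgw daemon is deployed; its only output is "167 167", the numeric uid and gid of the ceph user inside the image. The log does not record the container's entrypoint, so the following is only a sketch of an equivalent probe, assuming the podman CLI is on PATH and that /var/lib/ceph inside the image is owned by ceph:ceph:

    import subprocess

    # Image digest copied from the log lines above.
    IMAGE = ("quay.io/ceph/ceph@sha256:"
             "7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec")

    # Hypothetical equivalent of the probe: print the uid/gid owning
    # /var/lib/ceph inside the image; for this image that is "167 167".
    result = subprocess.run(
        ["podman", "run", "--rm", "--entrypoint", "stat", IMAGE,
         "-c", "%u %g", "/var/lib/ceph"],
        capture_output=True, text=True, check=True)
    print(result.stdout.strip())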
Dec  6 04:41:51 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:51 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:51 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec  6 04:41:51 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 43 pg[9.0( empty local-lis/les=0/0 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548916 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-1.oqhsdh on compute-1
Dec  6 04:41:51 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:51 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:51 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:52 np0005548916 systemd[1]: Starting Ceph rgw.rgw.compute-1.oqhsdh for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:41:52 np0005548916 podman[83334]: 2025-12-06 09:41:52.4117885 +0000 UTC m=+0.047707115 container create 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:41:52 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:52 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:52 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:52 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.oqhsdh supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:52 np0005548916 podman[83334]: 2025-12-06 09:41:52.387223006 +0000 UTC m=+0.023141641 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:52 np0005548916 podman[83334]: 2025-12-06 09:41:52.489643389 +0000 UTC m=+0.125562054 container init 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:41:52 np0005548916 podman[83334]: 2025-12-06 09:41:52.494690957 +0000 UTC m=+0.130609592 container start 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  6 04:41:52 np0005548916 bash[83334]: 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf
Dec  6 04:41:52 np0005548916 systemd[1]: Started Ceph rgw.rgw.compute-1.oqhsdh for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:41:52 np0005548916 radosgw[83354]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:41:52 np0005548916 radosgw[83354]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec  6 04:41:52 np0005548916 radosgw[83354]: framework: beast
Dec  6 04:41:52 np0005548916 radosgw[83354]: framework conf key: endpoint, val: 192.168.122.101:8082
Dec  6 04:41:52 np0005548916 radosgw[83354]: init_numa not setting numa affinity
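The banner above shows the beast frontend configured with endpoint 192.168.122.101:8082, i.e. an effective rgw_frontends of "beast endpoint=192.168.122.101:8082". A sketch for reading the applied option back with a "config get" mon command, reusing the librados pattern from the previous sketch; the daemon name is taken from this log, and it is an assumption that the option is stored under exactly this entity:

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    cmd = {"prefix": "config get",
           "who": "client.rgw.rgw.compute-1.oqhsdh",
           "key": "rgw_frontends"}
    ret, out, errs = cluster.mon_command(json.dumps(cmd), b"")
    print(out.decode().strip())   # e.g. beast endpoint=192.168.122.101:8082
    cluster.shutdown()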
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec  6 04:41:52 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 44 pg[9.0( empty local-lis/les=43/44 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/3027759423' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
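Pools created by RGW are tagged with the application they belong to, which is what the dispatch/finished pair for .rgw.root above records. The payload is small enough to quote whole; a sketch issuing it administratively, with the same librados assumptions as above:

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    # Idempotent: enabling an application that is already enabled is a no-op.
    cmd = {"prefix": "osd pool application enable",
           "pool": ".rgw.root", "app": "rgw"}
    ret, out, errs = cluster.mon_command(json.dumps(cmd), b"")
    print(ret, errs)
    cluster.shutdown()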
Dec  6 04:41:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:53 np0005548916 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-0.zktslo on compute-0
Dec  6 04:41:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec  6 04:41:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec  6 04:41:53 np0005548916 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec  6 04:41:55 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 47 pg[11.0( empty local-lis/les=0/0 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [0] r=0 lpr=47 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-2.czucwy on compute-2
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  6 04:41:55 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec  6 04:41:56 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 48 pg[11.0( empty local-lis/les=47/48 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [0] r=0 lpr=47 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-0.ujokui on compute-0
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e3 new map
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e3 print_map
e3
btime 2025-12-06T09:41:56.804272+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 2
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:41:29.967778+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in
up {}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 0
qdb_cluster leader: 0 members:

Standby daemons:

[mds.cephfs.compute-2.czucwy{-1:24274} state up:standby seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e4 new map
Dec  6 04:41:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e4 print_map
e4
btime 2025-12-06T09:41:56.835698+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 4
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:41:56.835690+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in 0
up {0=24274}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 0
qdb_cluster leader: 0 members:
[mds.cephfs.compute-2.czucwy{0:24274} state up:creating seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: daemon mds.cephfs.compute-2.czucwy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: Cluster is now healthy
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: daemon mds.cephfs.compute-2.czucwy is now active in filesystem cephfs as rank 0
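At this point both MDS health checks have cleared and the monitor reports the cluster healthy. A sketch for checking the same condition programmatically ("health" is a standard mon command; librados assumptions as in the earlier sketches):

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    ret, out, errs = cluster.mon_command(
        json.dumps({"prefix": "health", "format": "json"}), b"")
    health = json.loads(out)
    print(health["status"])   # HEALTH_OK once MDS_ALL_DOWN etc. have cleared
    cluster.shutdown()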
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e49 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e5 new map
Dec  6 04:41:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e5 print_map
e5
btime 2025-12-06T09:41:57.856282+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 5
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:41:57.856277+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in 0
up {0=24274}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 0
qdb_cluster leader: 24274 members: 24274
[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-1.fpvjgb on compute-1
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e6 new map
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e6 print_map
e6
btime 2025-12-06T09:41:58.872230+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 5
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:41:57.856277+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in 0
up {0=24274}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 0
qdb_cluster leader: 24274 members: 24274
[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e7 new map
Dec  6 04:41:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e7 print_map
e7
btime 2025-12-06T09:41:58.889029+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 5
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:41:57.856277+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in 0
up {0=24274}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 1
qdb_cluster leader: 24274 members: 24274
[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.136640873 +0000 UTC m=+0.059561972 container create b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  6 04:41:59 np0005548916 systemd[1]: Started libpod-conmon-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope.
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.106892878 +0000 UTC m=+0.029814017 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:59 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.239815633 +0000 UTC m=+0.162736762 container init b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.253012351 +0000 UTC m=+0.175933450 container start b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.256922843 +0000 UTC m=+0.179843942 container attach b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:41:59 np0005548916 magical_nash[84048]: 167 167
Dec  6 04:41:59 np0005548916 systemd[1]: libpod-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope: Deactivated successfully.
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.260822954 +0000 UTC m=+0.183744053 container died b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec  6 04:41:59 np0005548916 systemd[1]: var-lib-containers-storage-overlay-13bd6873fe53a8616bb0eae094bc00bb67d033bb8f1e78b02206f256dbcfd3e5-merged.mount: Deactivated successfully.
Dec  6 04:41:59 np0005548916 podman[84032]: 2025-12-06 09:41:59.316476164 +0000 UTC m=+0.239397283 container remove b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:59 np0005548916 systemd[1]: libpod-conmon-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope: Deactivated successfully.
Dec  6 04:41:59 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:59 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:59 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:59 np0005548916 systemd[1]: Reloading.
Dec  6 04:41:59 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:59 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:41:59 np0005548916 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:41:59 np0005548916 systemd[1]: Starting Ceph mds.cephfs.compute-1.fpvjgb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:00 np0005548916 radosgw[83354]: v1 topic migration: starting v1 topic migration..
Dec  6 04:42:00 np0005548916 radosgw[83354]: LDAP not started since no server URIs were provided in the configuration.
Dec  6 04:42:00 np0005548916 radosgw[83354]: v1 topic migration: finished v1 topic migration
Dec  6 04:42:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh[83350]: 2025-12-06T09:42:00.169+0000 7f396e20b980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec  6 04:42:00 np0005548916 radosgw[83354]: framework: beast
Dec  6 04:42:00 np0005548916 radosgw[83354]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec  6 04:42:00 np0005548916 radosgw[83354]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec  6 04:42:00 np0005548916 radosgw[83354]: starting handler: beast
Dec  6 04:42:00 np0005548916 radosgw[83354]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:42:00 np0005548916 radosgw[83354]: mgrc service_daemon_register rgw.24197 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.oqhsdh,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d81f60a3-cfd4-40b3-a809-ad3aae1b1fd0,zone_name=default,zonegroup_id=75773215-ab74-4afd-a4c0-f777a01e4a1a,zonegroup_name=default}
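The service_daemon_register line shows the new RGW instance publishing its metadata (version, container image, frontend, zone) into the cluster's service map via the mgr client. A sketch for inspecting that map with the "service dump" mon command, same librados assumptions as above; the exact JSON layout of the service map is an assumption:

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    ret, out, errs = cluster.mon_command(
        json.dumps({"prefix": "service dump", "format": "json"}), b"")
    servicemap = json.loads(out)
    # Expect an "rgw" service whose daemons include rgw.compute-1.oqhsdh
    # with the metadata quoted in the log line above.
    print(list(servicemap.get("services", {})))
    cluster.shutdown()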
Dec  6 04:42:00 np0005548916 podman[84221]: 2025-12-06 09:42:00.558206777 +0000 UTC m=+0.044856318 container create 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:42:00 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:00 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:00 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:00 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.fpvjgb supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:00 np0005548916 podman[84221]: 2025-12-06 09:42:00.622881928 +0000 UTC m=+0.109531489 container init 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid)
Dec  6 04:42:00 np0005548916 podman[84221]: 2025-12-06 09:42:00.629245737 +0000 UTC m=+0.115895278 container start 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  6 04:42:00 np0005548916 bash[84221]: 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e
Dec  6 04:42:00 np0005548916 podman[84221]: 2025-12-06 09:42:00.538114138 +0000 UTC m=+0.024763709 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:42:00 np0005548916 systemd[1]: Started Ceph mds.cephfs.compute-1.fpvjgb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: main not setting numa affinity
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: pidfile_write: ignore empty --pid-file
Dec  6 04:42:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb[84237]: starting mds.cephfs.compute-1.fpvjgb at 
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 7 from mon.2
Dec  6 04:42:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e8 new map
Dec  6 04:42:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e8 print_map
e8
btime 2025-12-06T09:42:00.908587+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name cephfs
epoch 8
flags 12 joinable allow_snaps allow_multimds_snaps
created 2025-12-06T09:41:29.967778+0000
modified 2025-12-06T09:42:00.880325+0000
tableserver 0
root 0
session_timeout 60
session_autoclose 300
max_file_size 1099511627776
max_xattr_size 65536
required_client_features {}
last_failure 0
last_failure_osd_epoch 0
compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds 1
in 0
up {0=24274}
failed
damaged
stopped
data_pools [7]
metadata_pool 6
inline_data disabled
balancer
bal_rank_mask -1
standby_count_wanted 1
qdb_cluster leader: 24274 members: 24274
[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
[mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 8 from mon.2
Dec  6 04:42:00 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Monitors have assigned me to become a standby
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:01 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.769710195 +0000 UTC m=+0.055684152 container create 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  6 04:42:01 np0005548916 systemd[1]: Started libpod-conmon-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope.
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.744770283 +0000 UTC m=+0.030744270 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:42:01 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.87782082 +0000 UTC m=+0.163794797 container init 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.885806087 +0000 UTC m=+0.171780044 container start 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.889062813 +0000 UTC m=+0.175036770 container attach 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:42:01 np0005548916 systemd[1]: libpod-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope: Deactivated successfully.
Dec  6 04:42:01 np0005548916 zen_goldwasser[84366]: 167 167
Dec  6 04:42:01 np0005548916 conmon[84366]: conmon 16e407b4e4845d7f6c53 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope/container/memory.events
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.894830438 +0000 UTC m=+0.180804395 container died 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  6 04:42:01 np0005548916 systemd[1]: var-lib-containers-storage-overlay-9f8c89d08f4caf7f1c2d0bba3afec1deda6f360e555a94df8ec3a28965ea630b-merged.mount: Deactivated successfully.
Dec  6 04:42:01 np0005548916 podman[84349]: 2025-12-06 09:42:01.94032549 +0000 UTC m=+0.226299457 container remove 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:42:01 np0005548916 systemd[1]: libpod-conmon-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope: Deactivated successfully.
Dec  6 04:42:01 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:02 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:02 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:02 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:02 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:02 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:02 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu-rgw
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Bind address in nfs.cephfs.0.0.compute-1.djsnbu's ganesha conf is defaulting to empty
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: Deploying daemon nfs.cephfs.0.0.compute-1.djsnbu on compute-1
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e9 new map
Dec  6 04:42:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e9 print_map
    e9
    btime 2025-12-06T09:42:02.933823+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name    cephfs
    epoch    8
    flags    12 joinable allow_snaps allow_multimds_snaps
    created    2025-12-06T09:41:29.967778+0000
    modified    2025-12-06T09:42:00.880325+0000
    tableserver    0
    root    0
    session_timeout    60
    session_autoclose    300
    max_file_size    1099511627776
    max_xattr_size    65536
    required_client_features    {}
    last_failure    0
    last_failure_osd_epoch    0
    compat    compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds    1
    in    0
    up    {0=24274}
    failed
    damaged
    stopped
    data_pools    [7]
    metadata_pool    6
    inline_data    disabled
    balancer
    bal_rank_mask    -1
    standby_count_wanted    1
    qdb_cluster    leader: 24274 members: 24274
    [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]

    Standby daemons:

    [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
    [mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:03 np0005548916 podman[84507]: 2025-12-06 09:42:03.000762058 +0000 UTC m=+0.057955535 container create 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:42:03 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:03 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:03 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:03 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:03 np0005548916 podman[84507]: 2025-12-06 09:42:02.977326091 +0000 UTC m=+0.034519568 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:42:03 np0005548916 podman[84507]: 2025-12-06 09:42:03.072685202 +0000 UTC m=+0.129878719 container init 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:42:03 np0005548916 podman[84507]: 2025-12-06 09:42:03.08177002 +0000 UTC m=+0.138963497 container start 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:42:03 np0005548916 bash[84507]: 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4
Dec  6 04:42:03 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:03 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:05 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e10 new map
Dec  6 04:42:05 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).mds e10 print_map
    e10
    btime 2025-12-06T09:42:05.044345+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name    cephfs
    epoch    8
    flags    12 joinable allow_snaps allow_multimds_snaps
    created    2025-12-06T09:41:29.967778+0000
    modified    2025-12-06T09:42:00.880325+0000
    tableserver    0
    root    0
    session_timeout    60
    session_autoclose    300
    max_file_size    1099511627776
    max_xattr_size    65536
    required_client_features    {}
    last_failure    0
    last_failure_osd_epoch    0
    compat    compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds    1
    in    0
    up    {0=24274}
    failed
    damaged
    stopped
    data_pools    [7]
    metadata_pool    6
    inline_data    disabled
    balancer
    bal_rank_mask    -1
    standby_count_wanted    1
    qdb_cluster    leader: 24274 members: 24274
    [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]

    Standby daemons:

    [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
    [mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:05 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 10 from mon.2
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:42:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
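
Every ganesha.nfsd line above has one fixed shape: timestamp, boot epoch, host, the thread in brackets, the emitting function, and then colon-prefixed component, severity, and message ("NFS STARTUP :EVENT :...", "DBUS :CRIT :...", and so on). When sifting large journals it can help to split those fields out; the snippet below is an editor's sketch using only the Python standard library, matched against one line copied from this log.

    import re

    # Fields: ts, epoch, host, pid, thread, func, component, level, msg.
    GANESHA_RE = re.compile(
        r"^(?P<ts>\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}) : "
        r"epoch (?P<epoch>[0-9a-f]+) : (?P<host>\S+) : "
        r"ganesha\.nfsd-(?P<pid>\d+)\[(?P<thread>[^\]]+)\] "
        r"(?P<func>\S+) :(?P<component>[^:]+) :(?P<level>[^:]+) :(?P<msg>.*)$"
    )

    line = ("06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : "
            "ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT "
            ":             NFS SERVER INITIALIZED")
    m = GANESHA_RE.match(line)
    if m:
        print(m.group("level"), m.group("component"), m.group("msg").strip())
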
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb-rgw
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: Bind address in nfs.cephfs.1.0.compute-2.sseuqb's ganesha conf is defaulting to empty
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: Deploying daemon nfs.cephfs.1.0.compute-2.sseuqb on compute-2
Dec  6 04:42:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:08 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck-rgw
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Bind address in nfs.cephfs.2.0.compute-0.dfwxck's ganesha conf is defaulting to empty
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: Deploying daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0
Dec  6 04:42:09 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:42:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548916 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-1.jmdafd on compute-1
Dec  6 04:42:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.067994466 +0000 UTC m=+4.788015484 container create 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.041916171 +0000 UTC m=+4.761937229 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:42:16 np0005548916 systemd[1]: Started libpod-conmon-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope.
Dec  6 04:42:16 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.165071931 +0000 UTC m=+4.885092939 container init 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.174844625 +0000 UTC m=+4.894865633 container start 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.178564464 +0000 UTC m=+4.898585472 container attach 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 blissful_edison[84789]: 0 0
Dec  6 04:42:16 np0005548916 systemd[1]: libpod-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope: Deactivated successfully.
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.18422647 +0000 UTC m=+4.904247478 container died 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 systemd[1]: var-lib-containers-storage-overlay-ad28b025bb456f81cdb6f60f7826e74df3cc72a199b13e204b08a3e31a602e1e-merged.mount: Deactivated successfully.
Dec  6 04:42:16 np0005548916 podman[84668]: 2025-12-06 09:42:16.233819868 +0000 UTC m=+4.953840876 container remove 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec  6 04:42:16 np0005548916 systemd[1]: libpod-conmon-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope: Deactivated successfully.
Dec  6 04:42:16 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:16 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:16 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:16 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:16 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:16 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:16 np0005548916 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.jmdafd for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:17 np0005548916 podman[84935]: 2025-12-06 09:42:17.1299086 +0000 UTC m=+0.042023998 container create 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:42:17 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e32abe081b2708d90d7e7598309a913b00ba5c2a87ffd8d8b498ae51bb15565/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:17 np0005548916 podman[84935]: 2025-12-06 09:42:17.20088771 +0000 UTC m=+0.113003198 container init 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:42:17 np0005548916 podman[84935]: 2025-12-06 09:42:17.111049128 +0000 UTC m=+0.023164546 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:42:17 np0005548916 podman[84935]: 2025-12-06 09:42:17.210452989 +0000 UTC m=+0.122568417 container start 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:42:17 np0005548916 bash[84935]: 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0
Dec  6 04:42:17 np0005548916 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.jmdafd for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/094217 (2) : New worker #1 (4) forked
Dec  6 04:42:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:18 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548916 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-0.fzuvue on compute-0
Dec  6 04:42:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:22 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001110 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:22 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:22 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:23 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:23 np0005548916 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-2.voodna on compute-2
Dec  6 04:42:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:24 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:26 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:28 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:28 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-1.uzbtlt on compute-1
Dec  6 04:42:29 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Dec  6 04:42:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Dec  6 04:42:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:30 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec  6 04:42:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:30 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec  6 04:42:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 54 pg[6.0( v 50'39 (0'0,50'39] local-lis/les=21/23 n=22 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54 pruub=13.692863464s) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 48'38 active pruub 189.167221069s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 54 pg[6.0( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54 pruub=13.692863464s) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 0'0 unknown pruub 189.167221069s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20002bc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:32 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.714945051 +0000 UTC m=+4.170299738 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
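
The pool resizes above are two-phased: "osd pool set ... pg_num" records the target PG count, after which the mgr steps pg_num_actual toward it (here in a single jump: 16 for cephfs.cephfs.meta, 32 for cephfs.cephfs.data and .nfs). One way to confirm the targets took effect is to read them back; this sketch shells out to the ceph CLI and assumes it, plus an admin keyring, is available on the host.

    import json
    import subprocess

    # Read back pg_num for the pools resized in the log above.
    for pool in ("cephfs.cephfs.meta", "cephfs.cephfs.data", ".nfs"):
        out = subprocess.run(
            ["ceph", "osd", "pool", "get", pool, "pg_num", "--format", "json"],
            check=True, capture_output=True, text=True,
        ).stdout
        print(pool, json.loads(out)["pg_num"])
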
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.73659834 +0000 UTC m=+4.191952997 container create c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, description=keepalived for Ceph, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, name=keepalived, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9)
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.c( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.8( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.9( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.a( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.e( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.f( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.5( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.2( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.3( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.4( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.7( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.6( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.d( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.b( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.c( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.0( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.4( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:32 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
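
Note: every ceph-osd peering line above packs the PG's identity and state-machine context into one fixed debug layout: osd id, map epoch, pgid, last-update version, acting set, and the state<...> transition at the end. A sketch that reduces those lines to (pgid, epoch, state, event) tuples, assuming only the layout visible here:

    import re
    from collections import Counter

    # Matches the PeeringState debug lines above, e.g.
    #   osd.0 pg_epoch: 55 pg[6.c( v 50'39 ... mbc={}] state<Start>: transitioning to Primary
    PG_LINE = re.compile(
        r"osd\.(?P<osd>\d+) pg_epoch: (?P<epoch>\d+) "
        r"pg\[(?P<pgid>[0-9a-f.]+)\(.*\] "
        r"state<(?P<state>[^>]+)>: (?P<event>.+)$"
    )

    def follow_pgs(lines):
        """Yield (pgid, epoch, state, event) for each peering debug line."""
        for line in lines:
            m = PG_LINE.search(line)
            if m:
                yield (m.group("pgid"), int(m.group("epoch")),
                       m.group("state"), m.group("event").strip())

    if __name__ == "__main__":
        import sys
        # Example: count how many PGs reported "Activating complete" per epoch.
        done = Counter(
            epoch for _, epoch, _, event in follow_pgs(sys.stdin)
            if "Activating complete" in event
        )
        for epoch, n in sorted(done.items()):
            print(f"epoch {epoch}: {n} PGs activated")
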
Dec  6 04:42:32 np0005548916 systemd[1]: Started libpod-conmon-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope.
Dec  6 04:42:32 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.829632068 +0000 UTC m=+4.284986765 container init c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, description=keepalived for Ceph, name=keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, version=2.2.4)
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.839327551 +0000 UTC m=+4.294682208 container start c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, name=keepalived, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.843305636 +0000 UTC m=+4.298660313 container attach c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, distribution-scope=public, description=keepalived for Ceph, name=keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, architecture=x86_64, version=2.2.4)
Dec  6 04:42:32 np0005548916 modest_pike[85157]: 0 0
Dec  6 04:42:32 np0005548916 systemd[1]: libpod-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope: Deactivated successfully.
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.849375501 +0000 UTC m=+4.304730158 container died c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, name=keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  6 04:42:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:32 np0005548916 systemd[1]: var-lib-containers-storage-overlay-d3df7fa136e824db67690e4f52b9f64af4b6569d1a91c442596b9d250f39f423-merged.mount: Deactivated successfully.
Dec  6 04:42:32 np0005548916 podman[85059]: 2025-12-06 09:42:32.995666445 +0000 UTC m=+4.451021102 container remove c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, release=1793, architecture=x86_64, name=keepalived)
Dec  6 04:42:33 np0005548916 systemd[1]: libpod-conmon-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope: Deactivated successfully.
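
Note: the keepalived container above runs through the full podman event chain in well under a second: create, init, start, attach, died, remove. The container (auto-named modest_pike) printed "0 0" and exited immediately, which suggests cephadm was probing the image rather than running the daemon; the real keepalived unit is started just below. A sketch for reducing those verbose label-laden lines to a per-container event timeline; the 64-hex-digit ID format and the event names are taken from the lines above, everything else is an assumption:

    import re
    from collections import defaultdict

    # podman journal lines carry: 'podman[pid]: <timestamp> +0000 UTC m=+... container <event> <64-hex id>'
    EVENT = re.compile(
        r"podman\[\d+\]: (?P<ts>\S+ \S+) \+0000 UTC m=\S+ "
        r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64})"
    )

    def container_timelines(lines):
        """Map short container ID -> ordered list of (timestamp, event)."""
        timelines = defaultdict(list)
        for line in lines:
            m = EVENT.search(line)
            if m:
                timelines[m.group("cid")[:12]].append(
                    (m.group("ts"), m.group("event"))
                )
        return timelines

    if __name__ == "__main__":
        import sys
        for cid, events in container_timelines(sys.stdin).items():
            chain = " -> ".join(ev for _, ev in events)
            # e.g. c79cdff044c6: create -> init -> start -> attach -> died -> remove
            print(f"{cid}: {chain}")
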
Dec  6 04:42:33 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:33 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:33 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec  6 04:42:33 np0005548916 systemd[1]: Reloading.
Dec  6 04:42:33 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:33 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:33 np0005548916 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.uzbtlt for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:33 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:33 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[8.0( v 51'44 (0'0,51'44] local-lis/les=40/41 n=5 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=12.352684975s) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 51'43 active pruub 190.152496338s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[9.0( v 44'12 (0'0,44'12] local-lis/les=43/44 n=6 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.953887939s) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 44'11 active pruub 192.753723145s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[9.0( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.953887939s) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 0'0 unknown pruub 192.753723145s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[8.0( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=12.352684975s) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 0'0 unknown pruub 190.152496338s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:33 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0).collection(8.0_head 0x55fb252cc6c0) operator()   moving buffer(0x55fb24a7f7e8 space 0x55fb24ac2760 0x0~1000 clean)
Dec  6 04:42:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:34 np0005548916 podman[85303]: 2025-12-06 09:42:34.019204561 +0000 UTC m=+0.048339699 container create c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=keepalived, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vcs-type=git, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Dec  6 04:42:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5954e0c88d233d83f2fcfd99401ef442e34ef24b6527071b725e5489ae056436/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:34 np0005548916 podman[85303]: 2025-12-06 09:42:34.080313145 +0000 UTC m=+0.109448313 container init c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, name=keepalived, vendor=Red Hat, Inc., vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, distribution-scope=public, release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.28.2, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived)
Dec  6 04:42:34 np0005548916 podman[85303]: 2025-12-06 09:42:34.087071147 +0000 UTC m=+0.116206285 container start c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, release=1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, name=keepalived, vendor=Red Hat, Inc., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Dec  6 04:42:34 np0005548916 bash[85303]: c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1
Dec  6 04:42:34 np0005548916 podman[85303]: 2025-12-06 09:42:34.001242331 +0000 UTC m=+0.030377489 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:42:34 np0005548916 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.uzbtlt for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Running on Linux 5.14.0-645.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025 (built for Linux 5.14.0)
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Starting VRRP child process, pid=4
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Startup complete
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: (VI_0) Entering BACKUP STATE (init)
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: VRRP_Script(check_backend) succeeded
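
Note: keepalived comes up in BACKUP for VRRP instance VI_0 and its check_backend script passes on the first run, so this node stands by for the NFS ingress VIP unless it later wins the VRRP election. A small sketch for pulling just those VRRP events out of the journal; the instance and script names come from the lines above, and nothing here is guaranteed stable across keepalived versions:

    import re

    # keepalived logs state changes as: '(VI_0) Entering BACKUP STATE (init)'
    # and script results as: 'VRRP_Script(check_backend) succeeded'
    STATE = re.compile(r"\((?P<inst>[^)]+)\) Entering (?P<state>\w+) STATE")
    SCRIPT = re.compile(r"VRRP_Script\((?P<name>[^)]+)\) (?P<result>\w+)")

    def vrrp_events(lines):
        """Yield (kind, name, value) for VRRP state changes and script results."""
        for line in lines:
            if (m := STATE.search(line)):
                yield ("state", m.group("inst"), m.group("state"))
            elif (m := SCRIPT.search(line)):
                yield ("script", m.group("name"), m.group("result"))

    if __name__ == "__main__":
        import sys
        for kind, name, value in vrrp_events(sys.stdin):
            print(f"{kind:6} {name}: {value}")
            # With the lines above: state  VI_0: BACKUP
            #                       script check_backend: succeeded
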
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec  6 04:42:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:34 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20002bc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
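
Note: the same ntirpc EVENT fires three times within two seconds on different ganesha worker threads (svc_7, svc_10, svc_4), each time marking an incoming connection on fd 37 dead; the bare '%' is reproduced exactly as logged. A sketch for tallying these per worker thread so repeated connection churn stands out; the regex mirrors only the fields visible above:

    import re
    from collections import Counter

    # ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fea... fd 37 ...
    TIRPC = re.compile(
        r"ganesha\.nfsd-\d+\[(?P<thread>\w+)\] rpc :TIRPC :EVENT "
        r":(?P<func>\w+): \S+ fd (?P<fd>\d+)"
    )

    def tirpc_events(lines):
        """Count TIRPC EVENT occurrences keyed by (thread, function, fd)."""
        counts = Counter()
        for line in lines:
            m = TIRPC.search(line)
            if m:
                counts[(m.group("thread"), m.group("func"), m.group("fd"))] += 1
        return counts

    if __name__ == "__main__":
        import sys
        for (thread, func, fd), n in sorted(tirpc_events(sys.stdin).items()):
            print(f"{thread:8} {func} fd={fd}: {n}")
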
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
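
Note: two osdmap epochs land back to back here (e56 at 04:42:33, e57 a second later) as the mgr's pg_num changes are committed; each summary reports the same healthy picture, 3 OSDs total, all up and in. A sketch for watching that summary line, assuming only the format visible above:

    import re

    # Monitor osdmap summaries look like:
    #   mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
    OSDMAP = re.compile(
        r"\.osd e(?P<epoch>\d+) e\d+: (?P<total>\d+) total, "
        r"(?P<up>\d+) up, (?P<in_>\d+) in"
    )

    def osdmap_epochs(lines):
        """Yield one dict of integer fields per osdmap summary line."""
        for line in lines:
            m = OSDMAP.search(line)
            if m:
                yield {k: int(v) for k, v in m.groupdict().items()}

    if __name__ == "__main__":
        import sys
        for snap in osdmap_epochs(sys.stdin):
            healthy = snap["up"] == snap["in_"] == snap["total"]
            flag = "" if healthy else "  <-- degraded"
            print(f"e{snap['epoch']}: {snap['total']} total / "
                  f"{snap['up']} up / {snap['in_']} in{flag}")
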
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1e( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.19( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.18( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1f( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.17( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.16( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.17( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.16( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.11( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.3( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.2( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.4( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.10( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.7( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.5( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.6( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.12( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.13( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.12( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1d( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1c( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1d( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.13( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1c( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1e( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.18( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.19( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1a( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1a( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.5( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.4( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1b( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1b( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.6( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.7( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1( v 44'12 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1( v 51'44 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.a( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.b( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.c( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.d( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.d( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.e( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.a( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.b( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.9( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.8( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.9( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.8( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.c( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.e( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.2( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.f( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.3( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.10( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.11( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.14( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.15( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.15( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.14( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.19( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.4( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.13( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1c( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1e( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.7( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.0( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.0( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.e( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.2( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.14( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.10( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.c( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:34 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.10( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.18( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.1e( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.9( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.13( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.b( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.8( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.f( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.e( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.4( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.3( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.2( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.6( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.1b( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.993197441s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.830062866s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.993165016s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830062866s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.10( v 57'45 (0'0,57'45] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992980957s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 51'44 mlcod 51'44 active pruub 194.830017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.10( v 57'45 (0'0,57'45] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992946625s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 51'44 mlcod 0'0 unknown NOTIFY pruub 194.830017090s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992822647s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.830001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992801666s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.830017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[11.0( v 48'48 (0'0,48'48] local-lis/les=47/48 n=8 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58 pruub=9.013495445s) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 48'47 active pruub 188.850738525s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992766380s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830001831s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.961063385s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798477173s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992439270s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992457390s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830017090s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992345810s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829940796s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960508347s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798431396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960480690s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798431396s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991816521s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829956055s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991789818s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829956055s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991558075s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991323471s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829711914s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991302490s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829711914s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991175652s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829757690s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991153717s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829757690s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-0.ylrrzf on compute-0
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.959497452s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798324585s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.959472656s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798324585s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990754128s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829757690s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990729332s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829757690s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990464211s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829650879s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990440369s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829650879s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990021706s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829620361s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989954948s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829620361s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989780426s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829589844s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989757538s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829589844s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989526749s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829559326s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989503860s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829559326s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991530418s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829940796s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.957810402s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798278809s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.957788467s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798278809s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988817215s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988798141s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829467773s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989374161s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.830276489s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988478661s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829452515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988456726s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829452515s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989293098s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830276489s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960991859s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798477173s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[11.0( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58 pruub=9.013495445s) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 0'0 unknown pruub 188.850738525s@ mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956918716s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798202515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956892014s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798202515s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988080025s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987852097s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829345703s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987829208s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829345703s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987977028s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829528809s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987812996s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829528809s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988037109s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829467773s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956246376s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798141479s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956223488s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798141479s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.955714226s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.797897339s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986533165s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.828842163s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.955690384s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.797897339s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986492157s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828842163s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986289024s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828735352s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986240387s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828735352s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986190796s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828781128s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986135483s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828781128s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985842705s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.828796387s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985840797s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828826904s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985815048s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828826904s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.979178429s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.822219849s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.979141235s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822219849s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978680611s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.822006226s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978662491s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822006226s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978631020s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.822036743s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978603363s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822036743s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978453636s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821975708s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978426933s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821975708s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978094101s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821838379s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978119850s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821884155s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978103638s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821884155s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978011131s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821823120s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978046417s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821838379s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977977753s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821823120s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977997780s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821929932s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977980614s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821929932s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977673531s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821838379s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977571487s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821746826s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977553368s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821746826s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977593422s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821838379s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.975193024s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819534302s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.975163460s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819534302s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985816956s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828796387s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977272987s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821731567s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977396965s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821929932s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977242470s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821731567s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977380753s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821929932s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974814415s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819503784s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974786758s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819503784s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974771500s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819503784s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974769592s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819503784s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974672318s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819473267s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974642754s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819473267s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974593163s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819519043s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974575996s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819519043s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974564552s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819549561s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974538803s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819549561s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.973692894s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819473267s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.973628998s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819473267s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.952226639s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798522949s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:35 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.952185631s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798522949s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Dec  6 04:42:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:36 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  6 04:42:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.17( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.16( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.13( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.9( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.8( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.2( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.3( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.4( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.7( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.18( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.19( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.10( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.11( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.5( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.6( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.12( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.15( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.14( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.1b( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.0( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.6( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.9( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.2( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.3( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.4( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.e( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.8( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.2( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.f( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.3( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.b( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.18( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.13( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.10( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.9( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.11( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.6( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.1e( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.18( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.14( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.10( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.15( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Dec  6 04:42:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Dec  6 04:42:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:37 2025: (VI_0) Entering MASTER STATE
Dec  6 04:42:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec  6 04:42:38 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec  6 04:42:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec  6 04:42:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:38 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec  6 04:42:39 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:39 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec  6 04:42:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec  6 04:42:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:39 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:40 np0005548916 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-2.whsrlg on compute-2
Dec  6 04:42:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec  6 04:42:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec  6 04:42:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:40 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:41 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.806502342s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.025405884s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.806447029s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.025405884s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.811082840s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030075073s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.811053276s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030075073s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810568810s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030014038s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810263634s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030014038s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.578481674s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798706055s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810064316s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030303955s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.578426361s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798706055s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810020447s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030303955s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.e( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809412003s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.030380249s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809444427s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030410767s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.577353477s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798385620s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809404373s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030410767s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.e( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809345245s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.030380249s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.577314377s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798385620s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809409142s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030548096s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809364319s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030548096s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809011459s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030593872s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576561928s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798171997s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808938026s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030593872s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.3( v 61'51 (0'0,61'51] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808858871s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.030593872s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576468468s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798171997s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.3( v 61'51 (0'0,61'51] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808789253s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.030593872s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814203262s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036163330s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814068794s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036163330s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576579094s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798721313s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814126968s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036315918s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576539040s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798721313s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814093590s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036315918s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813888550s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036331177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813417435s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036331177s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813626289s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036590576s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  6 04:42:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  6 04:42:41 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813585281s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036590576s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813253403s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036346436s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813191414s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036346436s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813434601s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036651611s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813399315s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036651611s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.14( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813307762s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.036697388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.14( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813238144s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.036697388s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813173294s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036697388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813140869s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036697388s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812614441s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036331177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812572479s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036331177s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812896729s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036743164s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812850952s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036743164s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812460899s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036560059s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812422752s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036560059s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:41 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.e scrub starts
Dec  6 04:42:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.e scrub ok
Dec  6 04:42:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:41 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.c deep-scrub starts
Dec  6 04:42:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:42 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec  6 04:42:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:42 2025: (VI_0) Entering BACKUP STATE
Dec  6 04:42:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.c deep-scrub ok
Dec  6 04:42:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:42 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:43 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:43 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  6 04:42:43 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  6 04:42:43 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  6 04:42:43 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.0 scrub starts
Dec  6 04:42:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.0 scrub ok
Dec  6 04:42:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:43 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1 deep-scrub starts
Dec  6 04:42:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1 deep-scrub ok
Dec  6 04:42:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:44 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec  6 04:42:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:45 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:45 np0005548916 ceph-mon[79770]: Deploying daemon alertmanager.compute-0 on compute-0
Dec  6 04:42:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec  6 04:42:45 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 67 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:45 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 67 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:45 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Dec  6 04:42:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Dec  6 04:42:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:46 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:47 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Dec  6 04:42:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Dec  6 04:42:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:47 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.e scrub starts
Dec  6 04:42:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.e scrub ok
Dec  6 04:42:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:48 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:49 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:49 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.a deep-scrub starts
Dec  6 04:42:49 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.a deep-scrub ok
Dec  6 04:42:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:49 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: Regenerating cephadm self-signed grafana TLS certificates
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548916 ceph-mon[79770]: Deploying daemon grafana.compute-0 on compute-0
Dec  6 04:42:50 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.2 scrub starts
Dec  6 04:42:50 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.2 scrub ok
Dec  6 04:42:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:50 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:51 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec  6 04:42:51 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1 scrub starts
Dec  6 04:42:51 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1 scrub ok
Dec  6 04:42:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  6 04:42:51 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  6 04:42:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:51 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:52 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Dec  6 04:42:52 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Dec  6 04:42:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  6 04:42:52 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  6 04:42:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:52 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:53 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:53 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Dec  6 04:42:53 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Dec  6 04:42:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec  6 04:42:53 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  6 04:42:53 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  6 04:42:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:53 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:54 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Dec  6 04:42:54 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Dec  6 04:42:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:54 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec  6 04:42:55 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  6 04:42:55 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  6 04:42:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:55 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:55 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1a scrub starts
Dec  6 04:42:55 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1a scrub ok
Dec  6 04:42:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:55 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec  6 04:42:56 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1b scrub starts
Dec  6 04:42:56 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1b scrub ok
Dec  6 04:42:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:56 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:57 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:57 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1e deep-scrub starts
Dec  6 04:42:57 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1e deep-scrub ok
Dec  6 04:42:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:57 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:58 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1f scrub starts
Dec  6 04:42:58 np0005548916 ceph-mon[79770]: Deploying daemon haproxy.rgw.default.compute-0.vhqyer on compute-0
Dec  6 04:42:58 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1f scrub ok
Dec  6 04:42:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec  6 04:42:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:58 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1d scrub starts
Dec  6 04:42:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:59 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1d scrub ok
Dec  6 04:42:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:59 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:42:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.004000096s ======
Dec  6 04:42:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:42:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000096s
Dec  6 04:43:00 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Dec  6 04:43:00 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Dec  6 04:43:00 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548916 ceph-mon[79770]: Deploying daemon haproxy.rgw.default.compute-2.mwbfro on compute-2
Dec  6 04:43:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:00 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:01 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:01 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.13 scrub starts
Dec  6 04:43:01 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.13 scrub ok
Dec  6 04:43:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:01 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:01.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:02 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Dec  6 04:43:02 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Dec  6 04:43:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:02 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:02 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:02 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  6 04:43:02 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  6 04:43:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Dec  6 04:43:03 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Dec  6 04:43:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:03.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:03.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:04 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Dec  6 04:43:04 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Dec  6 04:43:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:04 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec  6 04:43:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:05 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec  6 04:43:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:05.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:05 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[6.e( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[6.6( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: Deploying daemon keepalived.rgw.default.compute-0.mycoxk on compute-0
Dec  6 04:43:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:05.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:05 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.425968170s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.407394409s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.425865173s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.407394409s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.427437782s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410537720s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.427408218s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410537720s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426686287s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410598755s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426582336s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410598755s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426499367s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410690308s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426470757s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410690308s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[6.e( v 50'39 lc 48'19 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[6.6( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=50'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec  6 04:43:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:06 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  6 04:43:06 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  6 04:43:06 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec  6 04:43:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:07 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:07.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:07 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:07.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: Deploying daemon keepalived.rgw.default.compute-2.yurwwh on compute-2
Dec  6 04:43:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=66'1039 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=66'1039 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.3 deep-scrub starts
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.3 deep-scrub ok
Dec  6 04:43:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.006292343s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.041992188s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.000350952s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036636353s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.000283241s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036636353s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.005999565s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.041992188s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998921394s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036636353s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998816490s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036636353s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998106003s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036529541s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.997945786s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036529541s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=79/80 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=78'1042 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec  6 04:43:09 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec  6 04:43:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:09 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:09 np0005548916 systemd-logind[788]: New session 36 of user zuul.
Dec  6 04:43:09 np0005548916 systemd[1]: Started Session 36 of User zuul.
Dec  6 04:43:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:09.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:09 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:09.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: Deploying daemon prometheus.compute-0 on compute-0
Dec  6 04:43:10 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec  6 04:43:10 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec  6 04:43:10 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec  6 04:43:10 np0005548916 python3.9[85503]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  6 04:43:11 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  6 04:43:11 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec  6 04:43:11 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec  6 04:43:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:11 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:11 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec  6 04:43:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 82 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=82 pruub=9.370536804s) [1] r=-1 lpr=82 pi=[54,82)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 224.791793823s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 82 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=82 pruub=9.370141029s) [1] r=-1 lpr=82 pi=[54,82)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 224.791793823s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:11 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:11.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:12 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  6 04:43:12 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  6 04:43:12 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec  6 04:43:12 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec  6 04:43:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec  6 04:43:12 np0005548916 python3.9[85718]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:43:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:12 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:13 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  6 04:43:13 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  6 04:43:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec  6 04:43:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:13 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:13 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec  6 04:43:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec  6 04:43:13 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[6.9( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:13.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:13 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:13.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=84/85 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.515759) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194516128, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7353, "num_deletes": 255, "total_data_size": 21105504, "memory_usage": 21943440, "flush_reason": "Manual Compaction"}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194616079, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13003679, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 249, "largest_seqno": 7358, "table_properties": {"data_size": 12975211, "index_size": 18113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 90236, "raw_average_key_size": 24, "raw_value_size": 12903944, "raw_average_value_size": 3480, "num_data_blocks": 803, "num_entries": 3707, "num_filter_entries": 3707, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 1765014012, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 100373 microseconds, and 49877 cpu microseconds.
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.616249) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13003679 bytes OK
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.616287) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.618940) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.618979) EVENT_LOG_v1 {"time_micros": 1765014194618972, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.619004) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 21065478, prev total WAL file size 21065478, number of live WAL files 2.
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.623606) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194624103, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13005327, "oldest_snapshot_seqno": -1}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3456 keys, 13000182 bytes, temperature: kUnknown
Dec  6 04:43:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:14 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194746384, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 13000182, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12972411, "index_size": 18061, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 86058, "raw_average_key_size": 24, "raw_value_size": 12904224, "raw_average_value_size": 3733, "num_data_blocks": 801, "num_entries": 3456, "num_filter_entries": 3456, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.746667) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 13000182 bytes
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.747960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.4 rd, 106.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.4, 0.0 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3712, records dropped: 256 output_compression: NoCompression
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.748125) EVENT_LOG_v1 {"time_micros": 1765014194748111, "job": 4, "event": "compaction_finished", "compaction_time_micros": 122216, "compaction_time_cpu_micros": 73133, "output_level": 6, "num_output_files": 1, "total_output_size": 13000182, "num_input_records": 3712, "num_output_records": 3456, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194750354, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194750495, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec  6 04:43:14 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.623290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:15 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:15.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:15 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:15.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:15 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.010288) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196010404, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 342, "num_deletes": 253, "total_data_size": 249897, "memory_usage": 257848, "flush_reason": "Manual Compaction"}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196013436, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 165706, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7363, "largest_seqno": 7700, "table_properties": {"data_size": 163515, "index_size": 355, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4664, "raw_average_key_size": 15, "raw_value_size": 159093, "raw_average_value_size": 526, "num_data_blocks": 16, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014195, "oldest_key_time": 1765014195, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 3186 microseconds, and 1332 cpu microseconds.
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.013476) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 165706 bytes OK
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.013495) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014684) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014757) EVENT_LOG_v1 {"time_micros": 1765014196014751, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014774) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 247490, prev total WAL file size 247490, number of live WAL files 2.
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.015309) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323534' seq:0, type:0; will stop at (end)
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(161KB)], [15(12MB)]
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196015436, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13165888, "oldest_snapshot_seqno": -1}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3236 keys, 12743476 bytes, temperature: kUnknown
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196096444, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12743476, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12717033, "index_size": 17245, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8133, "raw_key_size": 83338, "raw_average_key_size": 25, "raw_value_size": 12652523, "raw_average_value_size": 3909, "num_data_blocks": 748, "num_entries": 3236, "num_filter_entries": 3236, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.096802) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12743476 bytes
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.098983) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.3 rd, 157.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.4 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(156.4) write-amplify(76.9) OK, records in: 3758, records dropped: 522 output_compression: NoCompression
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.099008) EVENT_LOG_v1 {"time_micros": 1765014196098996, "job": 6, "event": "compaction_finished", "compaction_time_micros": 81130, "compaction_time_cpu_micros": 27291, "output_level": 6, "num_output_files": 1, "total_output_size": 12743476, "num_input_records": 3758, "num_output_records": 3236, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196099415, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196101250, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.015219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:43:16 np0005548916 systemd[1]: session-34.scope: Deactivated successfully.
Dec  6 04:43:16 np0005548916 systemd[1]: session-34.scope: Consumed 26.960s CPU time.
Dec  6 04:43:16 np0005548916 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Dec  6 04:43:16 np0005548916 systemd-logind[788]: Removed session 34.
Dec  6 04:43:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec  6 04:43:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec  6 04:43:16 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec  6 04:43:16 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:16.475+0000 7f263b68c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  6 04:43:16 np0005548916 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:16.577+0000 7f263b68c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec  6 04:43:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:16 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec  6 04:43:17 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 88 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:17 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 88 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:17 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec  6 04:43:17 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec  6 04:43:17 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec  6 04:43:17 np0005548916 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec  6 04:43:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:17.717+0000 7f263b68c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:17.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:17.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec  6 04:43:18 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Dec  6 04:43:18 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.472+0000 7f263b68c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:  from numpy import show_config as show_numpy_config
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.731+0000 7f263b68c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:18 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.819+0000 7f263b68c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec  6 04:43:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.987+0000 7f263b68c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:43:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:19 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.0 scrub starts
Dec  6 04:43:19 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.0 scrub ok
Dec  6 04:43:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec  6 04:43:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:43:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:19.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec  6 04:43:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:19.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:19 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.260+0000 7f263b68c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:43:20 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.c scrub starts
Dec  6 04:43:20 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.c scrub ok
Dec  6 04:43:20 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.547+0000 7f263b68c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:43:20 np0005548916 systemd[1]: session-36.scope: Deactivated successfully.
Dec  6 04:43:20 np0005548916 systemd[1]: session-36.scope: Consumed 9.324s CPU time.
Dec  6 04:43:20 np0005548916 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Dec  6 04:43:20 np0005548916 systemd-logind[788]: Removed session 36.
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.634+0000 7f263b68c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.726+0000 7f263b68c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:20 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.812+0000 7f263b68c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec  6 04:43:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.895+0000 7f263b68c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec  6 04:43:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.276+0000 7f263b68c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:43:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:21 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.b scrub starts
Dec  6 04:43:21 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.b scrub ok
Dec  6 04:43:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.391+0000 7f263b68c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec  6 04:43:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:21.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.910+0000 7f263b68c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec  6 04:43:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:21.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:22 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.9 scrub starts
Dec  6 04:43:22 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.9 scrub ok
Dec  6 04:43:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.612+0000 7f263b68c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec  6 04:43:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.690+0000 7f263b68c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:43:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:22 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001080 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.789+0000 7f263b68c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec  6 04:43:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.966+0000 7f263b68c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.043+0000 7f263b68c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.257+0000 7f263b68c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:23 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.d scrub starts
Dec  6 04:43:23 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.d scrub ok
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.505+0000 7f263b68c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec  6 04:43:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec  6 04:43:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.814+0000 7f263b68c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec  6 04:43:23 np0005548916 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:43:23 np0005548916 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.893+0000 7f263b68c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x557a03a9d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: mgr load Constructed class from module: prometheus
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [prometheus INFO root] Starting engine...
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:23] ENGINE Bus STARTING
Dec  6 04:43:23 np0005548916 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:23] ENGINE Bus STARTING
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: CherryPy Checker:
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: The Application mounted at '' has an empty config.
Dec  6 04:43:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 
Dec  6 04:43:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:23.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:24 np0005548916 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec  6 04:43:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:24] ENGINE Serving on http://:::9283
Dec  6 04:43:24 np0005548916 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:24] ENGINE Serving on http://:::9283
Dec  6 04:43:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:24] ENGINE Bus STARTED
Dec  6 04:43:24 np0005548916 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:24] ENGINE Bus STARTED
Dec  6 04:43:24 np0005548916 ceph-mgr[80080]: [prometheus INFO root] Engine started.
Dec  6 04:43:24 np0005548916 systemd-logind[788]: New session 37 of user ceph-admin.
Dec  6 04:43:24 np0005548916 systemd[1]: Started Session 37 of User ceph-admin.
Dec  6 04:43:24 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.2 scrub starts
Dec  6 04:43:24 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.2 scrub ok
Dec  6 04:43:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:24 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001080 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:25 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Dec  6 04:43:25 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Dec  6 04:43:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:25.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:25 np0005548916 podman[85965]: 2025-12-06 09:43:25.81934101 +0000 UTC m=+0.091981698 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:43:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:25.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:25 np0005548916 podman[85965]: 2025-12-06 09:43:25.959735623 +0000 UTC m=+0.232376311 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:43:26 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.1f scrub starts
Dec  6 04:43:26 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.1f scrub ok
Dec  6 04:43:26 np0005548916 podman[86086]: 2025-12-06 09:43:26.456207324 +0000 UTC m=+0.066551439 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:43:26 np0005548916 podman[86086]: 2025-12-06 09:43:26.468731824 +0000 UTC m=+0.079075949 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:43:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:26 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002a80 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:26 np0005548916 podman[86176]: 2025-12-06 09:43:26.818912761 +0000 UTC m=+0.063045780 container exec 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:43:26 np0005548916 podman[86176]: 2025-12-06 09:43:26.833968275 +0000 UTC m=+0.078101294 container exec_died 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:43:27 np0005548916 podman[86243]: 2025-12-06 09:43:27.054816752 +0000 UTC m=+0.051950127 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:43:27 np0005548916 podman[86243]: 2025-12-06 09:43:27.065588226 +0000 UTC m=+0.062721581 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:43:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:27 np0005548916 podman[86309]: 2025-12-06 09:43:27.30043569 +0000 UTC m=+0.053521297 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., name=keepalived, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, distribution-scope=public, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vcs-type=git)
Dec  6 04:43:27 np0005548916 podman[86309]: 2025-12-06 09:43:27.318580353 +0000 UTC m=+0.071665930 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, version=2.2.4, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-type=git, architecture=x86_64, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph)
Dec  6 04:43:27 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.10 deep-scrub starts
Dec  6 04:43:27 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.10 deep-scrub ok
Dec  6 04:43:27 np0005548916 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:43:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:43:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:43:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002470 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Bus STARTING
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Client ('192.168.122.100', 44988) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Bus STARTED
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  6 04:43:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=92 pruub=14.626732826s) [1] r=-1 lpr=92 pi=[68,92)/1 crt=51'1027 mlcod 0'0 active pruub 247.411544800s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=92 pruub=14.626684189s) [1] r=-1 lpr=92 pi=[68,92)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 247.411544800s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=92 pruub=13.612200737s) [1] r=-1 lpr=92 pi=[67,92)/1 crt=51'1027 mlcod 0'0 active pruub 246.398071289s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=92 pruub=13.612160683s) [1] r=-1 lpr=92 pi=[67,92)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 246.398071289s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[6.b( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=92) [0] r=0 lpr=92 pi=[64,92)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:28 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[6.b( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=92/93 n=1 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=92) [0] r=0 lpr=92 pi=[64,92)/1 crt=50'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:29.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:30 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Dec  6 04:43:30 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Dec  6 04:43:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:30 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 94 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] async=[1] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:30 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 94 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] async=[1] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Dec  6 04:43:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:31.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  6 04:43:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  6 04:43:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  6 04:43:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  6 04:43:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=93/68 les/c/f=94/69/0 sis=95 pruub=15.010676384s) [1] async=[1] r=-1 lpr=95 pi=[68,95)/1 crt=51'1027 mlcod 51'1027 active pruub 250.848419189s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=93/68 les/c/f=94/69/0 sis=95 pruub=15.010571480s) [1] r=-1 lpr=95 pi=[68,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 250.848419189s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=93/67 les/c/f=94/68/0 sis=95 pruub=15.009842873s) [1] async=[1] r=-1 lpr=95 pi=[67,95)/1 crt=51'1027 mlcod 51'1027 active pruub 250.848556519s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=93/67 les/c/f=94/68/0 sis=95 pruub=15.009781837s) [1] r=-1 lpr=95 pi=[67,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 250.848556519s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:31.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:32 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.c scrub starts
Dec  6 04:43:32 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.c scrub ok
Dec  6 04:43:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:32 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.a scrub starts
Dec  6 04:43:33 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.a scrub ok
Dec  6 04:43:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:33.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:33.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.b scrub starts
Dec  6 04:43:34 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.b scrub ok
Dec  6 04:43:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:34 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec  6 04:43:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.e scrub starts
Dec  6 04:43:35 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.e scrub ok
Dec  6 04:43:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:35.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:36 np0005548916 systemd-logind[788]: New session 38 of user zuul.
Dec  6 04:43:36 np0005548916 systemd[1]: Started Session 38 of User zuul.
Dec  6 04:43:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Dec  6 04:43:36 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Dec  6 04:43:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:36 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc001f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548916 python3.9[87541]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  6 04:43:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Dec  6 04:43:37 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Dec  6 04:43:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:43:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy ignored for local
Dec  6 04:43:37 np0005548916 kernel: ganesha.nfsd[84964]: segfault at 50 ip 00007febd392032e sp 00007feb8d7f9210 error 4 in libntirpc.so.5.8[7febd3905000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  6 04:43:37 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:43:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:37 np0005548916 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec  6 04:43:37 np0005548916 systemd[1]: Started Process Core Dump (PID 87688/UID 0).
Dec  6 04:43:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:37.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:38 np0005548916 python3.9[87717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.6 deep-scrub starts
Dec  6 04:43:38 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.6 deep-scrub ok
Dec  6 04:43:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec  6 04:43:39 np0005548916 python3.9[87874]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:43:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Dec  6 04:43:39 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Dec  6 04:43:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:39.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.1c scrub starts
Dec  6 04:43:40 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.1c scrub ok
Dec  6 04:43:40 np0005548916 systemd-coredump[87690]: Process 84526 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 51:#012#0  0x00007febd392032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:43:40 np0005548916 python3.9[88028]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:43:40 np0005548916 systemd[1]: systemd-coredump@0-87688-0.service: Deactivated successfully.
Dec  6 04:43:40 np0005548916 systemd[1]: systemd-coredump@0-87688-0.service: Consumed 2.890s CPU time.
Dec  6 04:43:40 np0005548916 podman[88043]: 2025-12-06 09:43:40.963694823 +0000 UTC m=+0.029622557 container died 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 04:43:40 np0005548916 systemd[1]: var-lib-containers-storage-overlay-fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6-merged.mount: Deactivated successfully.
Dec  6 04:43:40 np0005548916 podman[88043]: 2025-12-06 09:43:40.998903992 +0000 UTC m=+0.064831326 container remove 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:43:41 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:43:41 np0005548916 systemd[81504]: Starting Mark boot as successful...
Dec  6 04:43:41 np0005548916 systemd[81504]: Finished Mark boot as successful.
Dec  6 04:43:41 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:43:41 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.627s CPU time.
Dec  6 04:43:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Dec  6 04:43:41 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Dec  6 04:43:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:41.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:41 np0005548916 python3.9[88229]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:43:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:41.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.d deep-scrub starts
Dec  6 04:43:42 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.d deep-scrub ok
Dec  6 04:43:42 np0005548916 python3.9[88382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:43:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.8 deep-scrub starts
Dec  6 04:43:43 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.8 deep-scrub ok
Dec  6 04:43:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:43.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:43.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec  6 04:43:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec  6 04:43:44 np0005548916 python3.9[88566]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=76/76 les/c/f=77/77/0 sis=100 pruub=8.985174179s) [1] r=-1 lpr=100 pi=[76,100)/1 crt=50'39 mlcod 50'39 active pruub 257.989654541s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=76/76 les/c/f=77/77/0 sis=100 pruub=8.984956741s) [1] r=-1 lpr=100 pi=[76,100)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 257.989654541s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.035808563s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 active pruub 261.040283203s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.035377502s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 261.040283203s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.038669586s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 active pruub 261.043975830s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:44 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.038622856s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 261.043975830s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:45 np0005548916 network[88600]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:43:45 np0005548916 network[88601]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:43:45 np0005548916 network[88602]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:43:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094345 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:43:45 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Dec  6 04:43:45 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  6 04:43:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:43:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:45.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:45.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.18 scrub starts
Dec  6 04:43:46 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.18 scrub ok
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: Reconfiguring mon.compute-0 (monmap changed)...
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: Reconfiguring daemon mon.compute-0 on compute-0
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec  6 04:43:47 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec  6 04:43:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:47.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.15 scrub starts
Dec  6 04:43:48 np0005548916 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.15 scrub ok
Dec  6 04:43:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:49.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:50.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:50 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec  6 04:43:50 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[6.f( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=102) [0] r=0 lpr=102 pi=[64,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:50 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] async=[1] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:50 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] async=[1] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:51 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 1.
Dec  6 04:43:51 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:43:51 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.627s CPU time.
Dec  6 04:43:51 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:43:51 np0005548916 python3.9[88874]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:43:51 np0005548916 podman[88914]: 2025-12-06 09:43:51.461902961 +0000 UTC m=+0.062415184 container create 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:43:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:51 np0005548916 podman[88914]: 2025-12-06 09:43:51.427743619 +0000 UTC m=+0.028255892 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:43:51 np0005548916 podman[88914]: 2025-12-06 09:43:51.530290546 +0000 UTC m=+0.130802749 container init 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  6 04:43:51 np0005548916 podman[88914]: 2025-12-06 09:43:51.544069588 +0000 UTC m=+0.144581771 container start 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec  6 04:43:51 np0005548916 bash[88914]: 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c
Dec  6 04:43:51 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:43:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:51 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:43:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:51 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:43:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:51.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:52.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:52 np0005548916 python3.9[89098]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:43:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:43:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.qhdjwa", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:43:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 103 pg[6.f( v 50'39 lc 48'1 (0'0,50'39] local-lis/les=102/103 n=3 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=102) [0] r=0 lpr=102 pi=[64,102)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:53 np0005548916 python3.9[89275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:53.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec  6 04:43:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.127288818s) [1] async=[1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 51'1027 active pruub 270.998352051s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.127108574s) [1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 270.998352051s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.113863945s) [1] async=[1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 51'1027 active pruub 270.985778809s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:53 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.113585472s) [1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 270.985778809s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:54.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: Reconfiguring mgr.compute-0.qhdjwa (monmap changed)...
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: Reconfiguring daemon mgr.compute-0.qhdjwa on compute-0
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:43:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: Reconfiguring crash.compute-0 (monmap changed)...
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: Reconfiguring daemon crash.compute-0 on compute-0
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: Reconfiguring osd.1 (monmap changed)...
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: Reconfiguring daemon osd.1 on compute-0
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548916 python3.9[89434]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:43:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:55.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:56.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:56 np0005548916 ceph-mon[79770]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec  6 04:43:56 np0005548916 ceph-mon[79770]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec  6 04:43:56 np0005548916 python3.9[89518]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:43:57 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:57 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:57.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:43:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:58.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:43:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec  6 04:43:58 np0005548916 ceph-mon[79770]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec  6 04:43:58 np0005548916 ceph-mon[79770]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec  6 04:43:58 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  6 04:43:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:58 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:43:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:58 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:43:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec  6 04:43:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  6 04:43:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.725850537 +0000 UTC m=+0.051339782 container create 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:43:59 np0005548916 systemd[1]: Started libpod-conmon-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope.
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.706306278 +0000 UTC m=+0.031795553 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:43:59 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:43:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:43:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:59.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.846443174 +0000 UTC m=+0.171932519 container init 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.861384516 +0000 UTC m=+0.186873811 container start 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.866296031 +0000 UTC m=+0.191785326 container attach 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:43:59 np0005548916 frosty_edison[89667]: 167 167
Dec  6 04:43:59 np0005548916 systemd[1]: libpod-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope: Deactivated successfully.
Dec  6 04:43:59 np0005548916 conmon[89667]: conmon 37fe98652af2394c8044 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope/container/memory.events
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.871860543 +0000 UTC m=+0.197349798 container died 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Dec  6 04:43:59 np0005548916 systemd[1]: var-lib-containers-storage-overlay-a369bfac63ba24ce3a211111e3ff78512d8d345c1c8cc876d5d4f7d91ce22fb4-merged.mount: Deactivated successfully.
Dec  6 04:43:59 np0005548916 podman[89650]: 2025-12-06 09:43:59.913242819 +0000 UTC m=+0.238732074 container remove 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  6 04:43:59 np0005548916 systemd[1]: libpod-conmon-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope: Deactivated successfully.
Dec  6 04:44:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:00.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:00 np0005548916 ceph-mon[79770]: Reconfiguring crash.compute-1 (monmap changed)...
Dec  6 04:44:00 np0005548916 ceph-mon[79770]: Reconfiguring daemon crash.compute-1 on compute-1
Dec  6 04:44:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  6 04:44:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  6 04:44:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec  6 04:44:00 np0005548916 podman[89753]: 2025-12-06 09:44:00.963743658 +0000 UTC m=+0.056563984 container create 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 04:44:01 np0005548916 systemd[1]: Started libpod-conmon-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope.
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:00.936736909 +0000 UTC m=+0.029557315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:44:01 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:01.065957937 +0000 UTC m=+0.158778343 container init 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:01.079100903 +0000 UTC m=+0.171921229 container start 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:01.082460308 +0000 UTC m=+0.175280724 container attach 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:01 np0005548916 awesome_torvalds[89769]: 167 167
Dec  6 04:44:01 np0005548916 systemd[1]: libpod-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope: Deactivated successfully.
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:01.088302497 +0000 UTC m=+0.181122883 container died 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 04:44:01 np0005548916 systemd[1]: var-lib-containers-storage-overlay-414f91675d0f2345b476b78ddef5cde869913f88bf3577b859b96b96ad00ca4c-merged.mount: Deactivated successfully.
Dec  6 04:44:01 np0005548916 podman[89753]: 2025-12-06 09:44:01.150963417 +0000 UTC m=+0.243783783 container remove 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:44:01 np0005548916 systemd[1]: libpod-conmon-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope: Deactivated successfully.
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: Reconfiguring osd.0 (monmap changed)...
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: Reconfiguring daemon osd.0 on compute-1
Dec  6 04:44:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:01.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:01 np0005548916 podman[89862]: 2025-12-06 09:44:01.952053082 +0000 UTC m=+0.048354486 container create dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:44:01 np0005548916 systemd[1]: Started libpod-conmon-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope.
Dec  6 04:44:02 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:44:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:02.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:01.932742819 +0000 UTC m=+0.029044253 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:02.035680716 +0000 UTC m=+0.131982140 container init dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:02.043129016 +0000 UTC m=+0.139430410 container start dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:02.046477051 +0000 UTC m=+0.142778455 container attach dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:44:02 np0005548916 happy_jemison[89878]: 167 167
Dec  6 04:44:02 np0005548916 systemd[1]: libpod-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope: Deactivated successfully.
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:02.048927204 +0000 UTC m=+0.145228608 container died dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:44:02 np0005548916 systemd[1]: var-lib-containers-storage-overlay-1f482cac385e65d30b3ff90a2272b3416c895fc4d05d57a7f4b44cfc5ab5b3e7-merged.mount: Deactivated successfully.
Dec  6 04:44:02 np0005548916 podman[89862]: 2025-12-06 09:44:02.196240714 +0000 UTC m=+0.292542118 container remove dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:44:02 np0005548916 systemd[1]: libpod-conmon-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope: Deactivated successfully.
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: Reconfiguring mon.compute-1 (monmap changed)...
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: Reconfiguring daemon mon.compute-1 on compute-1
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:44:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec  6 04:44:03 np0005548916 ceph-mon[79770]: Reconfiguring mon.compute-2 (monmap changed)...
Dec  6 04:44:03 np0005548916 ceph-mon[79770]: Reconfiguring daemon mon.compute-2 on compute-2
Dec  6 04:44:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:04.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec  6 04:44:04 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 112 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=112 pruub=10.954301834s) [2] r=-1 lpr=112 pi=[68,112)/1 crt=51'1027 mlcod 0'0 active pruub 279.412811279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:04 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 112 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=112 pruub=10.954230309s) [2] r=-1 lpr=112 pi=[68,112)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 279.412811279s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: Reconfiguring mgr.compute-2.oazbvn (monmap changed)...
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: Reconfiguring daemon mgr.compute-2.oazbvn on compute-2
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:44:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd34c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:05 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3480013b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:05 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec  6 04:44:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 113 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:05 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 113 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:05 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd324000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:06.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec  6 04:44:06 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec  6 04:44:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 114 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=114) [0] r=0 lpr=114 pi=[65,114)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:06 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 114 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] async=[2] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:06 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094407 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:44:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:07 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:07 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec  6 04:44:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec  6 04:44:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115 pruub=14.989471436s) [2] async=[2] r=-1 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 51'1027 active pruub 286.513122559s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115 pruub=14.989409447s) [2] r=-1 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 286.513122559s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=-1 lpr=115 pi=[65,115)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:07 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=-1 lpr=115 pi=[65,115)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:07.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:07 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3480020b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:08.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec  6 04:44:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:08 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:09 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:44:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:09.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:09 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:10.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:10 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec  6 04:44:10 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:10 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:10 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:11 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec  6 04:44:11 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 118 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=117/118 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:11 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:11 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:12.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:12 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:13 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:13 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:14.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec  6 04:44:14 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 119 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=119) [0] r=0 lpr=119 pi=[74,119)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec  6 04:44:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:14 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd318000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:15 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:15 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec  6 04:44:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 120 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=-1 lpr=120 pi=[74,120)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:15 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 120 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=-1 lpr=120 pi=[74,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:15 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec  6 04:44:15 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:15 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:15 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:16.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec  6 04:44:16 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec  6 04:44:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:16 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:17 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:17 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec  6 04:44:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec  6 04:44:17 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:17 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:17 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:44:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:18.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:44:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec  6 04:44:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec  6 04:44:18 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 123 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=122/123 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:18 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:19 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:19.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:19 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:20 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:21 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:21 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec  6 04:44:21 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec  6 04:44:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:21.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:21 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328003a70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:22 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  6 04:44:22 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  6 04:44:22 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  6 04:44:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:22 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec  6 04:44:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:23 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  6 04:44:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec  6 04:44:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:23.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec  6 04:44:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:23 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:24.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:24 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328003a70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec  6 04:44:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:25 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd318002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:25.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec  6 04:44:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec  6 04:44:25 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 127 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=127) [0] r=0 lpr=127 pi=[89,127)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:25 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:26.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:26 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy ignored for local
Dec  6 04:44:26 np0005548916 kernel: ganesha.nfsd[89936]: segfault at 50 ip 00007fd3f967032e sp 00007fd3b17f9210 error 4 in libntirpc.so.5.8[7fd3f9655000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 04:44:26 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:44:26 np0005548916 systemd[1]: Started Process Core Dump (PID 90075/UID 0).
Dec  6 04:44:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec  6 04:44:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec  6 04:44:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 128 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=128) [0]/[1] r=-1 lpr=128 pi=[89,128)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:27 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 128 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=128) [0]/[1] r=-1 lpr=128 pi=[89,128)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:27.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec  6 04:44:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:44:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:28.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:44:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec  6 04:44:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 130 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:28 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 130 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:29.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec  6 04:44:29 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 131 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=130/131 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:30 np0005548916 systemd-coredump[90076]: Process 88957 (ganesha.nfsd) of user 0 dumped core.
Dec  6 04:44:30 np0005548916 systemd-coredump[90076]: Stack trace of thread 54:
Dec  6 04:44:30 np0005548916 systemd-coredump[90076]: #0  0x00007fd3f967032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
Dec  6 04:44:30 np0005548916 systemd-coredump[90076]: ELF object binary architecture: AMD x86-64
Dec  6 04:44:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:30.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:30 np0005548916 systemd[1]: systemd-coredump@1-90075-0.service: Deactivated successfully.
Dec  6 04:44:30 np0005548916 systemd[1]: systemd-coredump@1-90075-0.service: Consumed 3.160s CPU time.
Dec  6 04:44:30 np0005548916 podman[90082]: 2025-12-06 09:44:30.235684416 +0000 UTC m=+0.042545541 container died 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 04:44:30 np0005548916 systemd[1]: var-lib-containers-storage-overlay-f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4-merged.mount: Deactivated successfully.
Dec  6 04:44:30 np0005548916 podman[90082]: 2025-12-06 09:44:30.396969354 +0000 UTC m=+0.203830449 container remove 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:44:30 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:44:30 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:44:30 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.668s CPU time.
Dec  6 04:44:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:31.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:32.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:33.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:34.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec  6 04:44:34 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec  6 04:44:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec  6 04:44:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094435 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:44:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:35.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:36.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec  6 04:44:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec  6 04:44:36 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 133 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=133) [0] r=0 lpr=133 pi=[97,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec  6 04:44:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec  6 04:44:37 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 134 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[97,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:37 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 134 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[97,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec  6 04:44:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec  6 04:44:39 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 136 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:39 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 136 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:39.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:40 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 2.
Dec  6 04:44:40 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:44:40 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.668s CPU time.
Dec  6 04:44:40 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:44:40 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec  6 04:44:40 np0005548916 ceph-osd[77465]: osd.0 pg_epoch: 137 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=136/137 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:41 np0005548916 podman[90180]: 2025-12-06 09:44:41.013563207 +0000 UTC m=+0.107260239 container create 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:44:41 np0005548916 podman[90180]: 2025-12-06 09:44:40.931822695 +0000 UTC m=+0.025519777 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:44:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:44:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:44:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:44:41 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:44:41 np0005548916 podman[90180]: 2025-12-06 09:44:41.094232492 +0000 UTC m=+0.187929544 container init 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:41 np0005548916 podman[90180]: 2025-12-06 09:44:41.100138716 +0000 UTC m=+0.193835748 container start 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec  6 04:44:41 np0005548916 bash[90180]: 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755
Dec  6 04:44:41 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:44:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:44:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:41.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:42.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:43 np0005548916 python3.9[90390]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:44:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:43.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:44.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec  6 04:44:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec  6 04:44:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec  6 04:44:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:44:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:45.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:44:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:46.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:46 np0005548916 python3.9[90703]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  6 04:44:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec  6 04:44:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec  6 04:44:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:44:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:44:47 np0005548916 python3.9[90856]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  6 04:44:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec  6 04:44:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:47.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:48 np0005548916 python3.9[91008]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:44:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:48.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:48 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec  6 04:44:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec  6 04:44:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec  6 04:44:49 np0005548916 python3.9[91161]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  6 04:44:49 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec  6 04:44:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec  6 04:44:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:49.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:50 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:44:50 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:44:50 np0005548916 python3.9[91314]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:44:50 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec  6 04:44:51 np0005548916 python3.9[91466]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:44:51 np0005548916 python3.9[91544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:44:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec  6 04:44:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:52.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Dec  6 04:44:53 np0005548916 python3.9[91697]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:44:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:53.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Dec  6 04:44:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:54 np0005548916 python3.9[91868]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  6 04:44:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:55 np0005548916 python3.9[92021]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  6 04:44:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:55.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:56.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:56 np0005548916 python3.9[92175]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:44:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094457 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:44:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:57 np0005548916 python3.9[92327]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  6 04:44:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:57.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:58 np0005548916 python3.9[92480]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:44:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:44:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:59.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:01 np0005548916 python3.9[92634]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:01 np0005548916 python3.9[92786]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:45:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:01.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:45:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:02.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:45:02 np0005548916 python3.9[92864]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:03 np0005548916 python3.9[93017]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:45:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:03 np0005548916 python3.9[93095]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:03.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:04.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:04 np0005548916 python3.9[93248]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:05.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:06.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:07 np0005548916 python3.9[93425]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:07.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:08.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:08 np0005548916 python3.9[93577]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  6 04:45:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:08 np0005548916 python3.9[93728]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:09.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:10.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:10 np0005548916 python3.9[93881]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:10 np0005548916 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  6 04:45:10 np0005548916 systemd[1]: tuned.service: Deactivated successfully.
Dec  6 04:45:10 np0005548916 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  6 04:45:10 np0005548916 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:45:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:11 np0005548916 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:45:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:11 np0005548916 python3.9[94043]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  6 04:45:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:11.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:12.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:13.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:14.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:15 np0005548916 python3.9[94197]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:15.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:16 np0005548916 python3.9[94401]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:16 np0005548916 systemd[1]: session-38.scope: Deactivated successfully.
Dec  6 04:45:16 np0005548916 systemd[1]: session-38.scope: Consumed 1min 11.601s CPU time.
Dec  6 04:45:16 np0005548916 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Dec  6 04:45:16 np0005548916 systemd-logind[788]: Removed session 38.
Dec  6 04:45:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:17.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094518 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:45:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:45:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:18.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:45:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0002390 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:19.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:20 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:45:20 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:20 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:20 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:45:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:21.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:22.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:22 np0005548916 systemd-logind[788]: New session 39 of user zuul.
Dec  6 04:45:22 np0005548916 systemd[1]: Started Session 39 of User zuul.
Dec  6 04:45:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:23 np0005548916 python3.9[94616]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:23.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:24.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.472824) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324473031, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2961, "num_deletes": 252, "total_data_size": 10711815, "memory_usage": 11054624, "flush_reason": "Manual Compaction"}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324519492, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6722042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7705, "largest_seqno": 10661, "table_properties": {"data_size": 6708894, "index_size": 8490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3589, "raw_key_size": 31688, "raw_average_key_size": 22, "raw_value_size": 6680720, "raw_average_value_size": 4688, "num_data_blocks": 370, "num_entries": 1425, "num_filter_entries": 1425, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014196, "oldest_key_time": 1765014196, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 46733 microseconds, and 21090 cpu microseconds.
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.519609) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6722042 bytes OK
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.519653) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521352) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521382) EVENT_LOG_v1 {"time_micros": 1765014324521374, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521403) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 10697508, prev total WAL file size 10715538, number of live WAL files 2.
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.524707) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6564KB)], [18(12MB)]
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324525081, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19465518, "oldest_snapshot_seqno": -1}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4125 keys, 14793424 bytes, temperature: kUnknown
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324739190, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14793424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14759695, "index_size": 22291, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105206, "raw_average_key_size": 25, "raw_value_size": 14678043, "raw_average_value_size": 3558, "num_data_blocks": 957, "num_entries": 4125, "num_filter_entries": 4125, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.739571) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14793424 bytes
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.742194) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.9 rd, 69.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.4, 12.2 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.1) write-amplify(2.2) OK, records in: 4661, records dropped: 536 output_compression: NoCompression
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.742224) EVENT_LOG_v1 {"time_micros": 1765014324742210, "job": 8, "event": "compaction_finished", "compaction_time_micros": 214203, "compaction_time_cpu_micros": 73553, "output_level": 6, "num_output_files": 1, "total_output_size": 14793424, "num_input_records": 4661, "num_output_records": 4125, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324744087, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324747192, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.524316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747301) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747309) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:25 np0005548916 python3.9[94798]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  6 04:45:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:25.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:26 np0005548916 python3.9[94977]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:45:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:26.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:26 np0005548916 python3.9[95062]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:45:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:45:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:27.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:28.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0001670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:29 np0005548916 python3.9[95216]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:29.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:30.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:45:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0001670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:31 np0005548916 python3.9[95370]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:45:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:31.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:32.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:32 np0005548916 python3.9[95524]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:45:33 np0005548916 python3.9[95678]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  6 04:45:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:33.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:34.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:34 np0005548916 python3.9[95829]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:35 np0005548916 python3.9[95987]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:35.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:36.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094536 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:45:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:37.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:38 np0005548916 python3.9[96141]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:45:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:38.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:39 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:39 np0005548916 python3.9[96429]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:45:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:45:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:40.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:45:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094540 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:45:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:40 np0005548916 python3.9[96580]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:41 np0005548916 python3.9[96734]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:42.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:43 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188000f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:43 np0005548916 python3.9[96889]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:44.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:44.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:45 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:46.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:46 np0005548916 python3.9[97068]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:45:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:46.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:47 np0005548916 python3.9[97223]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec  6 04:45:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:48.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:49 np0005548916 systemd[1]: session-39.scope: Deactivated successfully.
Dec  6 04:45:49 np0005548916 systemd[1]: session-39.scope: Consumed 18.849s CPU time.
Dec  6 04:45:49 np0005548916 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Dec  6 04:45:49 np0005548916 systemd-logind[788]: Removed session 39.
Dec  6 04:45:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:45:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:45:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:50.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:45:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0004190 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:51 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:52.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:45:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:54.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:56.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:56.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:56 np0005548916 systemd[1]: session-19.scope: Deactivated successfully.
Dec  6 04:45:56 np0005548916 systemd[1]: session-19.scope: Consumed 9.994s CPU time.
Dec  6 04:45:56 np0005548916 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Dec  6 04:45:56 np0005548916 systemd-logind[788]: Removed session 19.
Dec  6 04:45:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:57 np0005548916 systemd-logind[788]: New session 40 of user zuul.
Dec  6 04:45:57 np0005548916 systemd[1]: Started Session 40 of User zuul.
Dec  6 04:45:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:58.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:45:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:58 np0005548916 python3.9[97409]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094558 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:45:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:59 np0005548916 python3.9[97563]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:00.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:01 np0005548916 python3.9[97757]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:01 np0005548916 systemd[1]: session-40.scope: Deactivated successfully.
Dec  6 04:46:01 np0005548916 systemd[1]: session-40.scope: Consumed 2.527s CPU time.
Dec  6 04:46:01 np0005548916 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Dec  6 04:46:01 np0005548916 systemd-logind[788]: Removed session 40.
Dec  6 04:46:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:02.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:02.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:04.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:04.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:06.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:06.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:08.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:08.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:08 np0005548916 systemd-logind[788]: New session 41 of user zuul.
Dec  6 04:46:08 np0005548916 systemd[1]: Started Session 41 of User zuul.
Dec  6 04:46:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:10 np0005548916 python3.9[97967]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:10.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:11 np0005548916 python3.9[98122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:12.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:12 np0005548916 python3.9[98278]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:12.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:13 np0005548916 python3.9[98363]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:14.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:14.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:15 np0005548916 python3.9[98517]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:16.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:16 np0005548916 python3.9[98715]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:17 np0005548916 python3.9[98867]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:18.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:18.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:18 np0005548916 python3.9[99032]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:19 np0005548916 python3.9[99110]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:19 np0005548916 python3.9[99262]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:20.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:20 np0005548916 python3.9[99341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:21 np0005548916 python3.9[99493]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:22 np0005548916 python3.9[99645]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:22.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:22 np0005548916 python3.9[99798]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:23 np0005548916 python3.9[99950]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:24.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:46:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:24.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:46:24 np0005548916 python3.9[100103]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:26.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:46:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:46:27 np0005548916 python3.9[100363]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:28.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:28 np0005548916 python3.9[100518]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:46:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:29 np0005548916 python3.9[100670]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:46:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:30.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:30 np0005548916 python3.9[100822]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:31 np0005548916 python3.9[100976]: ansible-service_facts Invoked
Dec  6 04:46:31 np0005548916 network[100993]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:46:31 np0005548916 network[100994]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:46:31 np0005548916 network[100995]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:46:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:32.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:32.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:34.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:34.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:36.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:46:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:36.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:46:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:39 np0005548916 python3.9[101451]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:39 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:40.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:42.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:42 np0005548916 python3.9[101633]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  6 04:46:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:42.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094642 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:46:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:43 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:43 np0005548916 python3.9[101786]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:44.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:44 np0005548916 python3.9[101864]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:44.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:45 np0005548916 python3.9[102017]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:45 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:45 np0005548916 python3.9[102095]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:46.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:46.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:47 np0005548916 python3.9[102273]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:46:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:48.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:46:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:48.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:48 np0005548916 systemd[81504]: Created slice User Background Tasks Slice.
Dec  6 04:46:48 np0005548916 systemd[81504]: Starting Cleanup of User's Temporary Files and Directories...
Dec  6 04:46:48 np0005548916 systemd[81504]: Finished Cleanup of User's Temporary Files and Directories.
Dec  6 04:46:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:49 np0005548916 python3.9[102427]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:50.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:50 np0005548916 python3.9[102511]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:46:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:51 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:51 np0005548916 systemd[1]: session-41.scope: Deactivated successfully.
Dec  6 04:46:51 np0005548916 systemd[1]: session-41.scope: Consumed 25.288s CPU time.
Dec  6 04:46:51 np0005548916 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Dec  6 04:46:51 np0005548916 systemd-logind[788]: Removed session 41.
Dec  6 04:46:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:52.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:53 np0005548916 kernel: ganesha.nfsd[91724]: segfault at 50 ip 00007fa25de1e32e sp 00007fa22dffa210 error 4 in libntirpc.so.5.8[7fa25de03000+2c000] likely on CPU 3 (core 0, socket 3)
Dec  6 04:46:53 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:46:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy ignored for local
Dec  6 04:46:53 np0005548916 systemd[1]: Started Process Core Dump (PID 102541/UID 0).
Dec  6 04:46:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:46:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:54.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:46:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:54.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:55 np0005548916 systemd-coredump[102542]: Process 90200 (ganesha.nfsd) of user 0 dumped core.
                                                       
                                                       Stack trace of thread 41:
                                                       #0  0x00007fa25de1e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                       ELF object binary architecture: AMD x86-64
Dec  6 04:46:55 np0005548916 systemd[1]: systemd-coredump@2-102541-0.service: Deactivated successfully.
Dec  6 04:46:55 np0005548916 systemd[1]: systemd-coredump@2-102541-0.service: Consumed 1.819s CPU time.
Dec  6 04:46:55 np0005548916 podman[102548]: 2025-12-06 09:46:55.469733439 +0000 UTC m=+0.050723075 container died 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:46:55 np0005548916 systemd[1]: var-lib-containers-storage-overlay-0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8-merged.mount: Deactivated successfully.
Dec  6 04:46:55 np0005548916 podman[102548]: 2025-12-06 09:46:55.513934815 +0000 UTC m=+0.094924401 container remove 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  6 04:46:55 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:46:55 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:46:55 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.346s CPU time.
Dec  6 04:46:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:46:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:56.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:46:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:56.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:58 np0005548916 systemd-logind[788]: New session 42 of user zuul.
Dec  6 04:46:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:58 np0005548916 systemd[1]: Started Session 42 of User zuul.
Dec  6 04:46:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:46:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:58.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:59 np0005548916 python3.9[102751]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094659 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:46:59 np0005548916 python3.9[102903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:47:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:00.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:47:00 np0005548916 python3.9[102982]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:00 np0005548916 systemd[1]: session-42.scope: Deactivated successfully.
Dec  6 04:47:00 np0005548916 systemd[1]: session-42.scope: Consumed 1.729s CPU time.
Dec  6 04:47:00 np0005548916 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Dec  6 04:47:00 np0005548916 systemd-logind[788]: Removed session 42.
Dec  6 04:47:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:47:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:02.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:47:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:02.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:04.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:04.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:05 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 3.
Dec  6 04:47:05 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:47:05 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.346s CPU time.
Dec  6 04:47:05 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:47:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:06.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:06 np0005548916 podman[103083]: 2025-12-06 09:47:06.195367804 +0000 UTC m=+0.047598459 container create 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:47:06 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:47:06 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:47:06 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:47:06 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:47:06 np0005548916 podman[103083]: 2025-12-06 09:47:06.259932786 +0000 UTC m=+0.112163471 container init 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:47:06 np0005548916 podman[103083]: 2025-12-06 09:47:06.266954177 +0000 UTC m=+0.119184832 container start 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  6 04:47:06 np0005548916 podman[103083]: 2025-12-06 09:47:06.173328058 +0000 UTC m=+0.025558733 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:47:06 np0005548916 bash[103083]: 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab
Dec  6 04:47:06 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:47:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:06.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:47:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:07 np0005548916 systemd-logind[788]: New session 43 of user zuul.
Dec  6 04:47:07 np0005548916 systemd[1]: Started Session 43 of User zuul.
Dec  6 04:47:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:08 np0005548916 python3.9[103294]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:47:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:08.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:08.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:09 np0005548916 python3.9[103451]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:10.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:10 np0005548916 python3.9[103626]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:10.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:10 np0005548916 python3.9[103705]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9xj64iw3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:11 np0005548916 python3.9[103857]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:47:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:12.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:47:12 np0005548916 python3.9[103935]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.xqifjiqg recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:12.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:13 np0005548916 python3.9[104088]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:13 np0005548916 python3.9[104240]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:14.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:14 np0005548916 python3.9[104318]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:14.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:14 np0005548916 python3.9[104471]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 7ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:15 np0005548916 python3.9[104549]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:16.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:16 np0005548916 python3.9[104701]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:16.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:17 np0005548916 python3.9[104854]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:17 np0005548916 python3.9[104932]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec  6 04:47:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:18.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec  6 04:47:18 np0005548916 python3.9[105084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000009:nfs.cephfs.0: -2
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:47:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:47:18 np0005548916 python3.9[105176]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c340016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:20 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:20.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:20 np0005548916 python3.9[105331]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:47:20 np0005548916 systemd[1]: Reloading.
Dec  6 04:47:20 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:47:20 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:47:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:20.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:21 np0005548916 python3.9[105522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094721 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:21 np0005548916 python3.9[105600]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:22 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:22.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:47:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:47:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:23 np0005548916 python3.9[105753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:24 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:24.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:24 np0005548916 python3.9[105831]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:24.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:25 np0005548916 python3.9[105984]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:47:25 np0005548916 systemd[1]: Reloading.
Dec  6 04:47:25 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:47:25 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:47:25 np0005548916 systemd[1]: Starting Create netns directory...
Dec  6 04:47:25 np0005548916 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:47:25 np0005548916 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:47:25 np0005548916 systemd[1]: Finished Create netns directory.
Dec  6 04:47:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:26 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:26.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:26.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:26 np0005548916 python3.9[106201]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:47:26 np0005548916 network[106218]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:47:26 np0005548916 network[106219]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:47:26 np0005548916 network[106220]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:47:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:28 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:28.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:28.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:30 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:30.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:30.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:31 np0005548916 python3.9[106484]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:31 np0005548916 python3.9[106562]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:32 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:32.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:32.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:32 np0005548916 python3.9[106715]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094732 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:47:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:33 np0005548916 python3.9[106867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:33 np0005548916 python3.9[106945]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:34 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:34.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:34.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:35 np0005548916 python3.9[107098]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:47:35 np0005548916 systemd[1]: Starting Time & Date Service...
Dec  6 04:47:35 np0005548916 systemd[1]: Started Time & Date Service.
Dec  6 04:47:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:36 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:36.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:36 np0005548916 python3.9[107254]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:36.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:36 np0005548916 python3.9[107407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:37 np0005548916 python3.9[107485]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:38 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:38.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:38 np0005548916 python3.9[107637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:38.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:38 np0005548916 python3.9[107716]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zdxwm14a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:39 np0005548916 python3.9[107868]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:40 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:40 np0005548916 python3.9[107946]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:40.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:41 np0005548916 python3.9[108170]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:47:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:41 np0005548916 python3[108334]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:47:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:47:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:47:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:42 np0005548916 python3.9[108487]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:43 np0005548916 python3.9[108565]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:44 np0005548916 python3.9[108717]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:44 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:44.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:44 np0005548916 python3.9[108796]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:45 np0005548916 python3.9[108948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:46.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:46 np0005548916 python3.9[109026]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:46.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:47 np0005548916 python3.9[109204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:47 np0005548916 python3.9[109282]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:48 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:48.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:48 np0005548916 python3.9[109435]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:49 np0005548916 python3.9[109513]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.378767) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469379053, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1506, "num_deletes": 250, "total_data_size": 3823377, "memory_usage": 3879432, "flush_reason": "Manual Compaction"}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469404610, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1471945, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10666, "largest_seqno": 12167, "table_properties": {"data_size": 1467153, "index_size": 2188, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12146, "raw_average_key_size": 20, "raw_value_size": 1456834, "raw_average_value_size": 2407, "num_data_blocks": 97, "num_entries": 605, "num_filter_entries": 605, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014324, "oldest_key_time": 1765014324, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 25916 microseconds, and 9121 cpu microseconds.
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.404756) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1471945 bytes OK
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.404811) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417118) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417224) EVENT_LOG_v1 {"time_micros": 1765014469417210, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417264) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3816377, prev total WAL file size 3816377, number of live WAL files 2.
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.419252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1437KB)], [21(14MB)]
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469419509, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16265369, "oldest_snapshot_seqno": -1}
Dec  6 04:47:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4281 keys, 14215931 bytes, temperature: kUnknown
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469618650, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14215931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14183043, "index_size": 21066, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 108825, "raw_average_key_size": 25, "raw_value_size": 14100564, "raw_average_value_size": 3293, "num_data_blocks": 902, "num_entries": 4281, "num_filter_entries": 4281, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.619084) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14215931 bytes
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.621346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 81.6 rd, 71.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 14.1 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(20.7) write-amplify(9.7) OK, records in: 4730, records dropped: 449 output_compression: NoCompression
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.621378) EVENT_LOG_v1 {"time_micros": 1765014469621361, "job": 10, "event": "compaction_finished", "compaction_time_micros": 199280, "compaction_time_cpu_micros": 59394, "output_level": 6, "num_output_files": 1, "total_output_size": 14215931, "num_input_records": 4730, "num_output_records": 4281, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469621810, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469624307, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.418880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
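The rocksdb EVENT_LOG_v1 records above are single-line JSON after a fixed prefix, so the compaction can be checked offline: the write-amplify(9.7) in the JOB 10 summary is output bytes over newly written input, roughly 13.6 MB / 1.4 MB, and read-write-amplify(20.7) counts bytes read plus bytes written over the same new input. A minimal sketch (log path as argv[1]; event names as they appear in this log) that pulls out the compaction_finished events:

    import json, re, sys

    EVENT = re.compile(r'EVENT_LOG_v1 (\{.*\})\s*$')

    def events(path):
        # Yield the JSON payload of every EVENT_LOG_v1 line in a syslog file.
        with open(path, errors="replace") as f:
            for line in f:
                m = EVENT.search(line)
                if m:
                    yield json.loads(m.group(1))

    for ev in events(sys.argv[1]):
        if ev.get("event") == "compaction_finished":
            print(ev["job"], "L%d" % ev["output_level"],
                  ev["num_input_records"], "->", ev["num_output_records"],
                  ev["total_output_size"], "bytes")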
Dec  6 04:47:49 np0005548916 python3.9[109665]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
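The command above concatenates the edpm-* fragment files and feeds them to nft -c -f -, which parses the combined ruleset from stdin without applying any of it. The same pre-flight check sketched in Python, with the file list taken from the log line:

    import subprocess
    from pathlib import Path

    FRAGMENTS = [
        "/etc/nftables/edpm-chains.nft",
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
        "/etc/nftables/edpm-jumps.nft",
    ]

    def check_ruleset(paths=FRAGMENTS):
        # -c = check only: syntax and semantics are validated, nothing is loaded.
        blob = "".join(Path(p).read_text() for p in paths)
        proc = subprocess.run(["nft", "-c", "-f", "-"],
                              input=blob, capture_output=True, text=True)
        return proc.returncode == 0, proc.stderr

    ok, err = check_ruleset()
    print("ruleset ok" if ok else "ruleset invalid:\n" + err)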
Dec  6 04:47:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:50 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:50.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
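Each radosgw request leaves one beast access line with a fixed field order: request pointer, client IP, user (anonymous here), timestamp, request line, HTTP status, body bytes, and latency. A small parser for that layout, using the 09:47:50 line above as test input:

    import re

    BEAST = re.compile(
        r'beast: (?P<req>0x[0-9a-f]+): (?P<client>\S+) - (?P<user>\S+) '
        r'\[(?P<ts>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
        r'.*latency=(?P<latency>[\d.]+)s'
    )

    line = ('beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous '
            '[06/Dec/2025:09:47:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
            'latency=0.000000000s')

    m = BEAST.search(line)
    if m:
        print(m.group("client"), m.group("request"),
              m.group("status"), float(m.group("latency")))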
Dec  6 04:47:50 np0005548916 python3.9[109823]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
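In the blockinfile arguments above, #012 is the syslog-style escape for a newline (octal 012), so the block written between the ANSIBLE MANAGED BLOCK markers in /etc/sysconfig/nftables.conf is really four include lines. Decoding it:

    # rsyslog-style control-character escaping: #012 stands for a literal LF.
    block = ('include "/etc/nftables/iptables.nft"#012'
             'include "/etc/nftables/edpm-chains.nft"#012'
             'include "/etc/nftables/edpm-rules.nft"#012'
             'include "/etc/nftables/edpm-jumps.nft"#012')

    print(block.replace("#012", "\n"), end="")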
Dec  6 04:47:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:51 np0005548916 python3.9[109975]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:52 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:52 np0005548916 python3.9[110153]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:52.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:53 np0005548916 python3.9[110305]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:47:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:54 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:54 np0005548916 python3.9[110457]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
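Together with the two directory tasks before them, the mount tasks above leave hugetlbfs filesystems at /dev/hugepages1G and /dev/hugepages2M with pagesize=1G and pagesize=2M. A sketch that confirms the result from /proc/mounts; the kernel may normalise the pagesize spelling (e.g. 1024M instead of 1G), so the expected strings below are assumptions:

    EXPECTED = {
        "/dev/hugepages1G": {"pagesize=1G", "pagesize=1024M"},  # spelling is kernel-dependent
        "/dev/hugepages2M": {"pagesize=2M"},
    }

    def hugetlbfs_mounts(path="/proc/mounts"):
        # /proc/mounts fields: source mountpoint fstype options dump pass
        with open(path) as f:
            for line in f:
                src, mnt, fstype, opts, *_ = line.split()
                if fstype == "hugetlbfs":
                    yield mnt, set(opts.split(","))

    for mnt, opts in hugetlbfs_mounts():
        want = EXPECTED.get(mnt)
        if want is not None:
            print(mnt, "ok" if want & opts else "missing pagesize option")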
Dec  6 04:47:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:54.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:54 np0005548916 systemd[1]: session-43.scope: Deactivated successfully.
Dec  6 04:47:54 np0005548916 systemd[1]: session-43.scope: Consumed 31.935s CPU time.
Dec  6 04:47:54 np0005548916 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Dec  6 04:47:54 np0005548916 systemd-logind[788]: Removed session 43.
Dec  6 04:47:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094755 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:56 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:56.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:47:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:56.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:47:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:58 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:47:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:47:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:58.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:47:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:00 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:00.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:00 np0005548916 systemd-logind[788]: New session 44 of user zuul.
Dec  6 04:48:00 np0005548916 systemd[1]: Started Session 44 of User zuul.
Dec  6 04:48:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:01 np0005548916 python3.9[110641]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  6 04:48:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:02 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:02 np0005548916 python3.9[110794]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:02.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:03 np0005548916 python3.9[110948]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec  6 04:48:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:03 np0005548916 python3.9[111100]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.ua2f2h2u follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:04 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:04.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:04.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:04 np0005548916 python3.9[111226]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.ua2f2h2u mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014483.5146506-103-145943217284033/.source.ua2f2h2u _original_basename=.uocet0ja follow=False checksum=741dc69011fb61b699872c865e152b9968457717 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:05 np0005548916 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:48:05 np0005548916 python3.9[111378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:06.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:06.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:06 np0005548916 python3.9[111558]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=#012 create=True mode=0644 path=/tmp/ansible.ua2f2h2u state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:07 np0005548916 python3.9[111710]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ua2f2h2u' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
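The sequence from 04:48:01 to here is a stage-then-install pattern: build the complete known_hosts content in a throwaway file (tempfile, then blockinfile with the three hosts' RSA/ed25519/ECDSA keys), install it over /etc/ssh/ssh_known_hosts with a single cat, and remove the staging file afterwards. The same idea in outline, with the key material elided:

    import shutil, tempfile

    DEST = "/etc/ssh/ssh_known_hosts"
    entries = [
        # one "hostnames,addresses keytype base64-key" line per host key (elided)
    ]

    def install_known_hosts(lines, dest=DEST):
        # Build the finished file off to the side first, then install it in a
        # single copy step, mirroring the playbook's blockinfile + cat pair.
        with tempfile.NamedTemporaryFile("w", delete=False) as tmp:
            tmp.write("# BEGIN ANSIBLE MANAGED BLOCK\n")
            tmp.writelines(line + "\n" for line in lines)
            tmp.write("# END ANSIBLE MANAGED BLOCK\n")
        shutil.copyfile(tmp.name, dest)

    install_known_hosts(entries)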
Dec  6 04:48:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:08 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:08.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:08 np0005548916 python3.9[111865]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ua2f2h2u state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:08.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:08 np0005548916 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Dec  6 04:48:08 np0005548916 systemd[1]: session-44.scope: Deactivated successfully.
Dec  6 04:48:08 np0005548916 systemd[1]: session-44.scope: Consumed 5.280s CPU time.
Dec  6 04:48:08 np0005548916 systemd-logind[788]: Removed session 44.
Dec  6 04:48:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:10 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:10.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:10.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:12.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:13 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:13 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:14 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:14.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:15 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:15 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:16 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:16.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:17 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:17 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:18 np0005548916 systemd-logind[788]: New session 45 of user zuul.
Dec  6 04:48:18 np0005548916 systemd[1]: Started Session 45 of User zuul.
Dec  6 04:48:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:18.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:19 np0005548916 python3.9[112048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:20 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:20.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:20 np0005548916 python3.9[112205]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:48:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:21 np0005548916 python3.9[112360]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:48:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:22 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c340008d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:22.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:22.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:22 np0005548916 python3.9[112515]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:48:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:23 np0005548916 python3.9[112669]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:24 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:24.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:24.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:24 np0005548916 python3.9[112822]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
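The stat at 04:48:23 and the state=absent here look like a change-marker handshake: an earlier step touches /etc/nftables/edpm-rules.nft.changed only when the rules were rewritten, and this step acts on the marker and then clears it so the action does not repeat on an unchanged run. A sketch of the consumer side; the reload command is an assumption inferred from the surrounding nft tasks:

    import os, subprocess

    MARKER = "/etc/nftables/edpm-rules.nft.changed"

    def consume_marker(marker=MARKER):
        # No marker: the previous run left the rules untouched, nothing to do.
        if not os.path.exists(marker):
            return False
        # Assumed action: reload the rules file the marker refers to.
        subprocess.run(["nft", "-f", "/etc/nftables/edpm-rules.nft"], check=True)
        os.remove(marker)  # consume the marker so the reload happens only once
        return True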
Dec  6 04:48:24 np0005548916 systemd[1]: session-45.scope: Deactivated successfully.
Dec  6 04:48:24 np0005548916 systemd[1]: session-45.scope: Consumed 3.921s CPU time.
Dec  6 04:48:24 np0005548916 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Dec  6 04:48:24 np0005548916 systemd-logind[788]: Removed session 45.
Dec  6 04:48:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:26 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:26.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:28 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:28.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:28.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:30 np0005548916 systemd-logind[788]: New session 46 of user zuul.
Dec  6 04:48:30 np0005548916 systemd[1]: Started Session 46 of User zuul.
Dec  6 04:48:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:30 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:30.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:31 np0005548916 python3.9[113030]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:32 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:32.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:32 np0005548916 python3.9[113186]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:48:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:32.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:33 np0005548916 python3.9[113271]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:48:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:34 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:34.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:34 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:48:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:35 np0005548916 python3.9[113424]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:48:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:36 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:36.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:36.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:36 np0005548916 python3.9[113576]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:48:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:37 np0005548916 python3.9[113726]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:38 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:38 np0005548916 python3.9[113876]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:38.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:38.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:38 np0005548916 systemd[1]: session-46.scope: Deactivated successfully.
Dec  6 04:48:38 np0005548916 systemd[1]: session-46.scope: Consumed 6.154s CPU time.
Dec  6 04:48:38 np0005548916 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Dec  6 04:48:38 np0005548916 systemd-logind[788]: Removed session 46.
Dec  6 04:48:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:40 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:40.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:40.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:42.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:42.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:44 np0005548916 systemd-logind[788]: New session 47 of user zuul.
Dec  6 04:48:44 np0005548916 systemd[1]: Started Session 47 of User zuul.
Dec  6 04:48:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:44 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:44.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:44.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:45 np0005548916 python3.9[114058]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:46.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:46 np0005548916 python3.9[114240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:47 np0005548916 python3.9[114392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:48 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:48 np0005548916 python3.9[114544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:48.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:49 np0005548916 python3.9[114668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014527.7947607-158-104226358948331/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ce0f53450193d0e253b88fe8ddc0e5fff4cd2fd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:49 np0005548916 python3.9[114820]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:50 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:50 np0005548916 python3.9[114943]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014529.2583728-158-63358720721383/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=f805cc6455e59702aa77bd6ffe81bb9b155b0be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:50 np0005548916 python3.9[115096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:51 np0005548916 python3.9[115219]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014530.4886403-158-73806083441357/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=cca74c031dd75057ea2d8bce881d587d45d382dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:52 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:52 np0005548916 python3.9[115436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:52.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:53 np0005548916 python3.9[115660]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:48:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:53 np0005548916 python3.9[115828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:54 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:54 np0005548916 python3.9[115951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014533.2410674-346-274722918074657/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=b03d35dc7e50a7209707916f12027739ad55ce95 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:54.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:48:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:48:54 np0005548916 python3.9[116106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:55 np0005548916 python3.9[116229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014534.4098713-346-80246700132645/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:56 np0005548916 python3.9[116381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:56 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:56.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:48:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:56.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:48:56 np0005548916 python3.9[116505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014535.5992563-346-211906494804494/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=37cc98c5d0534b14ac35b3e937f249b933b173c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:57 np0005548916 python3.9[116657]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:58 np0005548916 python3.9[116834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:58 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:58.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:58 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:58 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:48:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:58 np0005548916 python3.9[116987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:59 np0005548916 python3.9[117110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014538.2281365-527-182369146049692/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=68af34295cc65bfb4aba41f49e48fcf0501e5a64 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:59 np0005548916 python3.9[117262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:00 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:00.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:00 np0005548916 python3.9[117386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014539.3632379-527-264476480564139/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:01 np0005548916 python3.9[117538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:01 np0005548916 python3.9[117661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014540.6613958-527-91508043372638/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=25ab66de5377411fb5d5a9d8f4de1740dfe1b562 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:02 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:02.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:02 np0005548916 python3.9[117814]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:03 np0005548916 python3.9[117966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:04 np0005548916 python3.9[118089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014543.0745337-729-178206841400791/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:04 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:04.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:04.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:04 np0005548916 python3.9[118242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:05 np0005548916 python3.9[118394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:05 np0005548916 python3.9[118517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014544.91911-801-216701076085284/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:06.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:06.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:06 np0005548916 python3.9[118670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:07 np0005548916 python3.9[118847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:07 np0005548916 python3.9[118970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014546.8989162-876-204554807211735/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:08 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:08.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:08 np0005548916 python3.9[119123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:09 np0005548916 python3.9[119275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:09 np0005548916 python3.9[119398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014548.769744-938-263244329203227/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:10 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:10.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:10 np0005548916 python3.9[119551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:10.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:11 np0005548916 python3.9[119703]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:11 np0005548916 python3.9[119826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014550.656027-1007-197312393562822/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy ignored for local
Dec  6 04:49:12 np0005548916 kernel: ganesha.nfsd[115953]: segfault at 50 ip 00007f6cfc73f32e sp 00007f6cc8ff8210 error 4 in libntirpc.so.5.8[7f6cfc724000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 04:49:12 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:49:12 np0005548916 systemd[1]: Started Process Core Dump (PID 119980/UID 0).
Dec  6 04:49:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:12.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:12 np0005548916 python3.9[119978]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:12.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:13 np0005548916 python3.9[120133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:13 np0005548916 python3.9[120256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014552.5999053-1079-183806100396446/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:13 np0005548916 systemd-coredump[119981]: Process 103103 (ganesha.nfsd) of user 0 dumped core.
    Stack trace of thread 61:
    #0  0x00007f6cfc73f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
    ELF object binary architecture: AMD x86-64
Dec  6 04:49:13 np0005548916 systemd[1]: systemd-coredump@3-119980-0.service: Deactivated successfully.
Dec  6 04:49:13 np0005548916 systemd[1]: systemd-coredump@3-119980-0.service: Consumed 1.523s CPU time.
Dec  6 04:49:13 np0005548916 podman[120285]: 2025-12-06 09:49:13.925897105 +0000 UTC m=+0.037514663 container died 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec  6 04:49:13 np0005548916 systemd[1]: var-lib-containers-storage-overlay-cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11-merged.mount: Deactivated successfully.
Dec  6 04:49:13 np0005548916 podman[120285]: 2025-12-06 09:49:13.970101114 +0000 UTC m=+0.081718662 container remove 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  6 04:49:13 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:49:14 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:49:14 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.154s CPU time.
Dec  6 04:49:14 np0005548916 systemd[1]: session-47.scope: Deactivated successfully.
Dec  6 04:49:14 np0005548916 systemd[1]: session-47.scope: Consumed 22.841s CPU time.
Dec  6 04:49:14 np0005548916 systemd-logind[788]: Session 47 logged out. Waiting for processes to exit.
Dec  6 04:49:14 np0005548916 systemd-logind[788]: Removed session 47.
Dec  6 04:49:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:14.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:14.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.209871) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556210053, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1079, "num_deletes": 251, "total_data_size": 2727903, "memory_usage": 2771408, "flush_reason": "Manual Compaction"}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556226469, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1766352, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12172, "largest_seqno": 13246, "table_properties": {"data_size": 1761486, "index_size": 2454, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10182, "raw_average_key_size": 19, "raw_value_size": 1751812, "raw_average_value_size": 3299, "num_data_blocks": 109, "num_entries": 531, "num_filter_entries": 531, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014470, "oldest_key_time": 1765014470, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 16664 microseconds, and 9100 cpu microseconds.
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.226551) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1766352 bytes OK
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.226586) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228101) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228124) EVENT_LOG_v1 {"time_micros": 1765014556228116, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228177) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2722607, prev total WAL file size 2722607, number of live WAL files 2.
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.229517) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1724KB)], [24(13MB)]
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556229686, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15982283, "oldest_snapshot_seqno": -1}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4294 keys, 14025129 bytes, temperature: kUnknown
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556312772, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 14025129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13993143, "index_size": 20164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109889, "raw_average_key_size": 25, "raw_value_size": 13911384, "raw_average_value_size": 3239, "num_data_blocks": 852, "num_entries": 4294, "num_filter_entries": 4294, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.313229) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 14025129 bytes
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.314632) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.0 rd, 168.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.6 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(17.0) write-amplify(7.9) OK, records in: 4812, records dropped: 518 output_compression: NoCompression
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.314653) EVENT_LOG_v1 {"time_micros": 1765014556314642, "job": 12, "event": "compaction_finished", "compaction_time_micros": 83249, "compaction_time_cpu_micros": 45166, "output_level": 6, "num_output_files": 1, "total_output_size": 14025129, "num_input_records": 4812, "num_output_records": 4294, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556315048, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556317929, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.229350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318109) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:16.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:16.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:18.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094919 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:49:20 np0005548916 systemd-logind[788]: New session 48 of user zuul.
Dec  6 04:49:20 np0005548916 systemd[1]: Started Session 48 of User zuul.
Dec  6 04:49:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:20.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:49:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:49:20 np0005548916 python3.9[120488]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:21 np0005548916 python3.9[120640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:22.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:22 np0005548916 python3.9[120764]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014561.1660461-63-193965291504767/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=944de880f37676f80f6e04a4864888bf3f7decbf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:22.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:23 np0005548916 python3.9[120916]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:23 np0005548916 python3.9[121039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014562.7279217-63-138893949447265/.source.conf _original_basename=ceph.conf follow=False checksum=531c84d7e2c99e4f6cf7d56dd7b16abeaf31bfa1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:24 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 4.
Dec  6 04:49:24 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:49:24 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.154s CPU time.
Dec  6 04:49:24 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:49:24 np0005548916 systemd[1]: session-48.scope: Deactivated successfully.
Dec  6 04:49:24 np0005548916 systemd[1]: session-48.scope: Consumed 2.806s CPU time.
Dec  6 04:49:24 np0005548916 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Dec  6 04:49:24 np0005548916 systemd-logind[788]: Removed session 48.
Dec  6 04:49:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:24.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:24 np0005548916 podman[121110]: 2025-12-06 09:49:24.410599064 +0000 UTC m=+0.046797864 container create b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec  6 04:49:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:49:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:49:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:49:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:49:24 np0005548916 podman[121110]: 2025-12-06 09:49:24.474200265 +0000 UTC m=+0.110399095 container init b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:49:24 np0005548916 podman[121110]: 2025-12-06 09:49:24.479605459 +0000 UTC m=+0.115804259 container start b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:49:24 np0005548916 bash[121110]: b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2
Dec  6 04:49:24 np0005548916 podman[121110]: 2025-12-06 09:49:24.391644993 +0000 UTC m=+0.027843793 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:49:24 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:49:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:49:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:24.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:26.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:26.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:28.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:30 np0005548916 systemd-logind[788]: New session 49 of user zuul.
Dec  6 04:49:30 np0005548916 systemd[1]: Started Session 49 of User zuul.
Dec  6 04:49:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:30.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:30 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:49:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:30 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:49:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:31 np0005548916 python3.9[121348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:49:32 np0005548916 python3.9[121504]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:49:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:32.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:49:32 np0005548916 python3.9[121657]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:33 np0005548916 python3.9[121807]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:49:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:34.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:34 np0005548916 python3.9[121960]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  6 04:49:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:49:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 20.58 MB, 0.03 MB/s#012Interval WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtab
Dec  6 04:49:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:49:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:49:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:36.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:49:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:49:36 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  6 04:49:37 np0005548916 python3.9[122129]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:49:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:37 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:37 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:37 np0005548916 python3.9[122216]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:49:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:38 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:38.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:38.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:39 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:39 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094939 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:49:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:40 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:40.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:40 np0005548916 python3.9[122371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:49:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:40.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:41 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:41 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:41 np0005548916 python3[122526]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  6 04:49:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:42 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:42.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:42 np0005548916 python3.9[122679]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:42.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:43 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:43 np0005548916 python3.9[122831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:43 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:43 np0005548916 python3.9[122909]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:44 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:44.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:44 np0005548916 python3.9[123062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:44.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:45 np0005548916 python3.9[123140]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.os1jjdd_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:45 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:45 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:45 np0005548916 python3.9[123292]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:46 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:46 np0005548916 python3.9[123370]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:46.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:46.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:47 np0005548916 python3.9[123548]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
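
`nft -j list ruleset` dumps the live ruleset as JSON, which gives the role a machine-readable snapshot to compare against the desired state before anything is changed. A quick sketch of inspecting that output by hand; the jq filters are illustrative, not part of the role:

    # count top-level ruleset objects and list the table names present
    nft -j list ruleset | jq '.nftables | length'
    nft -j list ruleset | jq -r '.nftables[] | select(.table) | .table.name' | sort -u
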
Dec  6 04:49:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:47 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:47 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:48 np0005548916 python3[123701]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:49:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:48 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:48.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:48.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:48 np0005548916 python3.9[123854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:49 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:49 np0005548916 python3.9[123979]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014588.2467675-432-65815724871084/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:49 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:50 np0005548916 python3.9[124131]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:50 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:50.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:50.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:50 np0005548916 python3.9[124257]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014589.6480289-477-13677852418489/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:51 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:51 np0005548916 python3.9[124409]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:51 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:52 np0005548916 python3.9[124534]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014591.054216-522-66352359629040/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:52 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:52.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:52.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:52 np0005548916 python3.9[124687]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:53 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:53 np0005548916 python3.9[124812]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014592.3433285-567-229076604709049/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:53 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:54 np0005548916 python3.9[124964]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:54 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:54.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:54.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:54 np0005548916 python3.9[125090]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014593.6718242-612-54795851242859/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:55 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:55 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:55 np0005548916 python3.9[125242]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:56 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:56 np0005548916 python3.9[125395]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
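
This command validates the assembled ruleset before it is ever loaded: the five fragment files are concatenated in dependency order (chain definitions first, then flushes, rules, and the jump updates and jumps) and fed to nft with -c, which parses and checks the input without committing anything to the kernel. The same check can be run by hand, using exactly the paths from the log:

    set -o pipefail
    cat /etc/nftables/edpm-chains.nft \
        /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f -
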
Dec  6 04:49:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:56.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:56.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:57 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:57 np0005548916 python3.9[125550]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
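
Decoded from the #012 (newline) escapes above, the block that blockinfile maintains in /etc/sysconfig/nftables.conf is just a set of include statements, validated with `nft -c -f %s` before the file is written, so the EDPM rules are reloaded on every nftables service start:

    # BEGIN ANSIBLE MANAGED BLOCK
    include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    # END ANSIBLE MANAGED BLOCK
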
Dec  6 04:49:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:57 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:58 np0005548916 python3.9[125752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:49:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:58 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f64001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:58.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:49:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:58.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:58 np0005548916 python3.9[125937]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:49:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:59 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:49:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:59 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:59 np0005548916 python3.9[126091]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
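
This is the apply step, gated on the /etc/nftables/edpm-rules.nft.changed flag (touched at 04:49:55, checked at 04:49:58, removed at 04:50:00 once the load succeeds). Only the flush, rules, and update-jumps fragments are replayed here; the chain definitions were loaded separately at 04:49:58 via `nft -f /etc/nftables/edpm-chains.nft`, and the jump insertions are presumably already in place, so a re-run refreshes rule contents without duplicating jumps. The equivalent manual apply:

    set -o pipefail
    cat /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft | nft -f -
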
Dec  6 04:50:00 np0005548916 ceph-mon[79770]: overall HEALTH_OK
Dec  6 04:50:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:00 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:00 np0005548916 python3.9[126247]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:00.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:00.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:01 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:01 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:01 np0005548916 python3.9[126397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:02 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:02.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:02.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:02 np0005548916 python3.9[126551]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:02 np0005548916 ovs-vsctl[126552]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
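
These external_ids are the knobs ovn-controller reads from the local Open_vSwitch table: ovn-remote points it at the southbound OVSDB, ovn-encap-ip and ovn-encap-type select the geneve tunnel endpoint, and ovn-bridge-mappings ties the datacentre physical network to br-ex. A quick way to verify what was set, using standard ovs-vsctl syntax:

    ovs-vsctl get Open_vSwitch . external_ids:ovn-remote
    ovs-vsctl --columns=external_ids list Open_vSwitch .
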
Dec  6 04:50:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:03 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:03 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:03 np0005548916 python3.9[126704]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:04 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:04.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:04 np0005548916 python3.9[126885]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:04 np0005548916 ovs-vsctl[126886]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
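
This creates an OVSDB manager listener on plain TCP at 127.0.0.1:6640, typically used by local tooling to reach the switch database; the `ovs-vsctl show | grep -q "Manager"` probe at 04:50:03 found none, which is why this create step ran. The ansible log line above is masked and truncated by no_log handling, but ovs-vsctl echoes the unmasked form, reproduced here:

    ovs-vsctl --timeout=5 --id=@manager -- \
      create Manager 'target="ptcp:6640:127.0.0.1"' -- \
      add Open_vSwitch . manager_options @manager
    ovs-vsctl show | grep Manager   # should now print the ptcp target
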
Dec  6 04:50:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:04.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:50:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:50:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:05 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:05 np0005548916 python3.9[127036]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:05 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:06 np0005548916 python3.9[127190]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:06 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:06.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:06.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:06 np0005548916 python3.9[127343]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:07 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:07 np0005548916 python3.9[127446]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:07 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:07 np0005548916 python3.9[127598]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:08 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:08 np0005548916 python3.9[127677]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:08.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:08.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:09 np0005548916 python3.9[127830]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:09 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:09 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f3c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:09 np0005548916 python3.9[127982]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:10 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:10 np0005548916 python3.9[128060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:10.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:10.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:10 np0005548916 python3.9[128213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:11 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:11 np0005548916 python3.9[128291]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:11 np0005548916 kernel: ganesha.nfsd[122066]: segfault at 50 ip 00007f801805232e sp 00007f7fe37fd210 error 4 in libntirpc.so.5.8[7f8018037000+2c000] likely on CPU 6 (core 0, socket 6)
Dec  6 04:50:11 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:50:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:11 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy ignored for local
Dec  6 04:50:11 np0005548916 systemd[1]: Started Process Core Dump (PID 128316/UID 0).
Dec  6 04:50:12 np0005548916 python3.9[128445]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:12 np0005548916 systemd[1]: Reloading.
Dec  6 04:50:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:12.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:12 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:12 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:50:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2212 writes, 13K keys, 2212 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2212 writes, 2212 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2212 writes, 13K keys, 2212 commit groups, 1.0 writes per commit group, ingest: 38.35 MB, 0.06 MB/s#012Interval WAL: 2212 writes, 2212 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    113.2      0.19              0.09         6    0.032       0      0       0.0       0.0#012  L6      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0    106.1     93.7      0.70              0.28         5    0.140     21K   2281       0.0       0.0#012 Sum      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     83.0     97.9      0.89              0.37        11    0.081     21K   2281       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     83.2     98.2      0.89              0.37        10    0.089     21K   2281       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    106.1     93.7      0.70              0.28         5    0.140     21K   2281       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    114.4      0.19              0.09         5    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.022, interval 0.022#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 1.62 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000112 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(91,1.41 MB,0.464475%) FilterBlock(11,71.42 KB,0.0229434%) IndexBlock(11,143.02 KB,0.045942%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 04:50:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:12.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:13 np0005548916 systemd-coredump[128317]: Process 121129 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f801805232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:50:13 np0005548916 systemd[1]: systemd-coredump@4-128316-0.service: Deactivated successfully.
Dec  6 04:50:13 np0005548916 systemd[1]: systemd-coredump@4-128316-0.service: Consumed 1.545s CPU time.
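
The segfault at 04:50:11 killed ganesha.nfsd inside libntirpc: the faulting address 0x50 together with the disassembled instruction suggests a read through a NULL structure pointer at offset 0x50, and systemd-coredump captured the core for PID 121129. A sketch of the usual triage, assuming gdb and the matching debuginfo packages are available; the offset 0x2232e comes from the stack trace below:

    coredumpctl list ganesha.nfsd    # find the dump captured above
    coredumpctl info 121129          # metadata plus the recorded stack trace
    coredumpctl debug 121129         # open the core in gdb
    # resolve the faulting offset inside the library (needs libntirpc debuginfo):
    addr2line -f -e /usr/lib64/libntirpc.so.5.8 0x2232e
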
Dec  6 04:50:13 np0005548916 podman[128567]: 2025-12-06 09:50:13.318367316 +0000 UTC m=+0.037156838 container died b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:50:13 np0005548916 systemd[1]: var-lib-containers-storage-overlay-f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28-merged.mount: Deactivated successfully.
Dec  6 04:50:13 np0005548916 podman[128567]: 2025-12-06 09:50:13.36296989 +0000 UTC m=+0.081759392 container remove b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  6 04:50:13 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:50:13 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:50:13 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.886s CPU time.
Dec  6 04:50:13 np0005548916 python3.9[128676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:14 np0005548916 python3.9[128757]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:14.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:14 np0005548916 python3.9[128910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:15 np0005548916 python3.9[128988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:16 np0005548916 python3.9[129140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:16 np0005548916 systemd[1]: Reloading.
Dec  6 04:50:16 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:16 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:16.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:16 np0005548916 systemd[1]: Starting Create netns directory...
Dec  6 04:50:16 np0005548916 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:50:16 np0005548916 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:50:16 np0005548916 systemd[1]: Finished Create netns directory.
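The tasks above deploy the unit file and its preset, then reload systemd and enable-and-start the unit. The hand-run equivalent of that ansible-ansible.builtin.systemd invocation, as a sketch:

    systemctl daemon-reload
    systemctl enable --now netns-placeholder.service

The 'Deactivated successfully' directly before 'Finished' is consistent with a oneshot-style unit: it creates the netns directory and exits.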
Dec  6 04:50:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:16.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:17 np0005548916 python3.9[129333]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095017 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:50:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:18 np0005548916 python3.9[129485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:18.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:18 np0005548916 python3.9[129609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014617.5573714-1365-189790454814158/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:18.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:19 np0005548916 python3.9[129761]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:20 np0005548916 python3.9[129913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:20.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:20 np0005548916 python3.9[130037]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014619.7168067-1440-214802979455028/.source.json _original_basename=.zws48ewl follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:20.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:21 np0005548916 python3.9[130189]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:22.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:22.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:23 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 5.
Dec  6 04:50:23 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:50:23 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.886s CPU time.
Dec  6 04:50:23 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:50:23 np0005548916 podman[130665]: 2025-12-06 09:50:23.778924366 +0000 UTC m=+0.045535468 container create aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:50:23 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:50:23 np0005548916 python3.9[130636]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  6 04:50:23 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:50:23 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:50:23 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
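The 'supports timestamps until 2038' kernel lines are informational: the backing xfs filesystem was created without the bigtime feature, so inode timestamps cap at 2038. One way to check, assuming /var/lib/containers/storage sits on its own xfs mount (sketch):

    xfs_info /var/lib/containers/storage | grep -o 'bigtime=[01]'   # bigtime=1 means y2038-safe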
Dec  6 04:50:23 np0005548916 podman[130665]: 2025-12-06 09:50:23.758740848 +0000 UTC m=+0.025351990 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:50:23 np0005548916 podman[130665]: 2025-12-06 09:50:23.861955329 +0000 UTC m=+0.128566461 container init aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:50:23 np0005548916 podman[130665]: 2025-12-06 09:50:23.867714034 +0000 UTC m=+0.134325146 container start aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  6 04:50:23 np0005548916 bash[130665]: aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01
Dec  6 04:50:23 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:50:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
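Ganesha opens a 90-second grace window so existing clients can reclaim locks and state; since the backend records no clients (clid count(0) at 04:50:30 below), grace is lifted early at 04:50:36. To follow the grace lifecycle for just this instance (sketch, using the container name as the syslog tag):

    journalctl -t ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu | grep -i grace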
Dec  6 04:50:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:24.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:24.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:24 np0005548916 python3.9[130875]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:50:25 np0005548916 python3.9[131027]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:50:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:26.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:26.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:27 np0005548916 python3[131232]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 04:50:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:28.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:28.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:50:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:50:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:30.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:32.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:32 np0005548916 podman[131246]: 2025-12-06 09:50:32.823378695 +0000 UTC m=+5.080776288 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec  6 04:50:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:33 np0005548916 podman[131368]: 2025-12-06 09:50:33.011490176 +0000 UTC m=+0.072737454 container create 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  6 04:50:33 np0005548916 podman[131368]: 2025-12-06 09:50:32.974223137 +0000 UTC m=+0.035470495 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec  6 04:50:33 np0005548916 python3[131232]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
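The PODMAN-CONTAINER-DEBUG line above is edpm_container_manage echoing the exact podman create it issued. Both the stored create command and the declared healthcheck can be re-checked after the fact (sketch):

    podman inspect ovn_controller --format '{{.Config.CreateCommand}}'
    podman healthcheck run ovn_controller && echo healthy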
Dec  6 04:50:34 np0005548916 python3.9[131557]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:34.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:34.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:35 np0005548916 python3.9[131712]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:35 np0005548916 python3.9[131788]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:36 np0005548916 python3.9[131939]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014635.641164-1704-110454569519603/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:50:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
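The DBUS :CRIT lines during this startup stem from the container lacking a bind mount of /run/dbus/system_bus_socket, so ganesha's D-Bus service thread gives up; core NFS service still initializes, and haproxy returns the backend to UP at 04:50:39. Host-side spot checks (sketch; 2049 is the assumed NFS port):

    test -S /run/dbus/system_bus_socket && echo host dbus socket exists
    ss -tlnp | grep -w 2049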
Dec  6 04:50:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:36.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:36.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:36 np0005548916 python3.9[132028]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:50:36 np0005548916 systemd[1]: Reloading.
Dec  6 04:50:37 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:37 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:37 np0005548916 python3.9[132141]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:37 np0005548916 systemd[1]: Reloading.
Dec  6 04:50:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:37 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:37 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:38 np0005548916 systemd[1]: Starting ovn_controller container...
Dec  6 04:50:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:38 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:50:38 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ca9670be7d4b8862f9a7ddfabd1ceaf608d59f574346d1575185ef1bc74ed2b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  6 04:50:38 np0005548916 systemd[1]: Started /usr/bin/podman healthcheck run 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c.
Dec  6 04:50:38 np0005548916 podman[132184]: 2025-12-06 09:50:38.381974734 +0000 UTC m=+0.164369423 container init 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + sudo -E kolla_set_configs
Dec  6 04:50:38 np0005548916 podman[132184]: 2025-12-06 09:50:38.412007551 +0000 UTC m=+0.194402250 container start 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec  6 04:50:38 np0005548916 edpm-start-podman-container[132184]: ovn_controller
Dec  6 04:50:38 np0005548916 systemd[1]: Created slice User Slice of UID 0.
Dec  6 04:50:38 np0005548916 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  6 04:50:38 np0005548916 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  6 04:50:38 np0005548916 systemd[1]: Starting User Manager for UID 0...
Dec  6 04:50:38 np0005548916 edpm-start-podman-container[132182]: Creating additional drop-in dependency for "ovn_controller" (00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c)
Dec  6 04:50:38 np0005548916 podman[132207]: 2025-12-06 09:50:38.505867987 +0000 UTC m=+0.079983347 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  6 04:50:38 np0005548916 systemd[1]: 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c-7073a8bb21fd3bfa.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:50:38 np0005548916 systemd[1]: 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c-7073a8bb21fd3bfa.service: Failed with result 'exit-code'.
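The transient 00ce4a06…-7073a8bb21fd3bfa.service failing with status=1 is the first podman healthcheck run, fired while the container still reports health_status=starting (see the health_status entry above). Re-running it by hand once ovn-controller is up (sketch; the inspect field is '.State.Health.Status' on recent podman, '.State.Healthcheck.Status' on older releases):

    podman healthcheck run ovn_controller; echo rc=$?
    podman inspect ovn_controller --format '{{.State.Health.Status}}'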
Dec  6 04:50:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:38.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:38 np0005548916 systemd[1]: Reloading.
Dec  6 04:50:38 np0005548916 systemd[132242]: Queued start job for default target Main User Target.
Dec  6 04:50:38 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:38 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:38 np0005548916 systemd[132242]: Created slice User Application Slice.
Dec  6 04:50:38 np0005548916 systemd[132242]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  6 04:50:38 np0005548916 systemd[132242]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:50:38 np0005548916 systemd[132242]: Reached target Paths.
Dec  6 04:50:38 np0005548916 systemd[132242]: Reached target Timers.
Dec  6 04:50:38 np0005548916 systemd[132242]: Starting D-Bus User Message Bus Socket...
Dec  6 04:50:38 np0005548916 systemd[132242]: Starting Create User's Volatile Files and Directories...
Dec  6 04:50:38 np0005548916 systemd[132242]: Finished Create User's Volatile Files and Directories.
Dec  6 04:50:38 np0005548916 systemd[132242]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:50:38 np0005548916 systemd[132242]: Reached target Sockets.
Dec  6 04:50:38 np0005548916 systemd[132242]: Reached target Basic System.
Dec  6 04:50:38 np0005548916 systemd[132242]: Reached target Main User Target.
Dec  6 04:50:38 np0005548916 systemd[132242]: Startup finished in 135ms.
Dec  6 04:50:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:38 np0005548916 systemd[1]: Started User Manager for UID 0.
Dec  6 04:50:38 np0005548916 systemd[1]: Started ovn_controller container.
Dec  6 04:50:38 np0005548916 systemd[1]: Started Session c1 of User root.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: INFO:__main__:Validating config file
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: INFO:__main__:Writing out command to execute
Dec  6 04:50:38 np0005548916 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: ++ cat /run_command
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + ARGS=
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + sudo kolla_copy_cacerts
Dec  6 04:50:38 np0005548916 systemd[1]: Started Session c2 of User root.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + [[ ! -n '' ]]
Dec  6 04:50:38 np0005548916 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + . kolla_extend_start
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + umask 0022
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9687] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9701] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9722] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9731] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9737] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  6 04:50:38 np0005548916 kernel: br-int: entered promiscuous mode
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:38 np0005548916 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9924] manager: (ovn-1b31b2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9933] manager: (ovn-127282-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec  6 04:50:38 np0005548916 NetworkManager[48956]: <info>  [1765014638.9939] manager: (ovn-d39b5b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec  6 04:50:39 np0005548916 systemd-udevd[132338]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:50:39 np0005548916 kernel: genev_sys_6081: entered promiscuous mode
Dec  6 04:50:39 np0005548916 systemd-udevd[132339]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:50:39 np0005548916 NetworkManager[48956]: <info>  [1765014639.0154] device (genev_sys_6081): carrier: link connected
Dec  6 04:50:39 np0005548916 NetworkManager[48956]: <info>  [1765014639.0162] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Dec  6 04:50:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:39 np0005548916 python3.9[132470]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:39 np0005548916 ovs-vsctl[132471]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  6 04:50:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095039 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:50:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
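The recurring ganesha svc_vc_recv EVENT lines are consistent with the haproxy instance above health-checking the NFS backend: a bare layer-4 connect supplies no PROXY-protocol header, so ganesha logs the failed header read and marks the connection dead. That correlation is an inference from these lines, not something the log states outright. A hedged sketch for tallying the noise per worker thread, assuming the thread-name format shown:

    import re
    from collections import Counter

    SVC_RE = re.compile(r"ganesha\.nfsd-\d+\[(svc_\d+)\].*svc_vc_recv")

    def tally_svc_events(lines):
        # Count svc_vc_recv EVENT lines per ganesha worker (svc_4, svc_5, ...).
        return Counter(m.group(1) for m in map(SVC_RE.search, lines) if m)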
Dec  6 04:50:40 np0005548916 python3.9[132623]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:40 np0005548916 ovs-vsctl[132626]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
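The task above reads external_ids:ovn-cms-options and pipes it through sed to strip the quoting that ovs-vsctl adds; when the key has never been set, ovs-vsctl exits non-zero with the db_ctl_base ERR shown, which the follow-up removal task simply tolerates. A sketch of the same read in Python, treating the missing key as None (assumes ovs-vsctl on PATH and a reachable local OVSDB):

    import subprocess

    def get_ovs_external_id(key):
        """Read external_ids:<key> from the local Open_vSwitch record,
        returning None when the key is unset (the ERR case above)."""
        proc = subprocess.run(
            ["ovs-vsctl", "get", "Open_vSwitch", ".", f"external_ids:{key}"],
            capture_output=True, text=True,
        )
        if proc.returncode != 0:      # "no key ... in Open_vSwitch record"
            return None
        return proc.stdout.strip().strip('"')  # same effect as sed 's/\"//g'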
Dec  6 04:50:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:40.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:40.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:41 np0005548916 python3.9[132779]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:41 np0005548916 ovs-vsctl[132780]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  6 04:50:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:41 np0005548916 systemd[1]: session-49.scope: Deactivated successfully.
Dec  6 04:50:41 np0005548916 systemd[1]: session-49.scope: Consumed 59.975s CPU time.
Dec  6 04:50:41 np0005548916 systemd-logind[788]: Session 49 logged out. Waiting for processes to exit.
Dec  6 04:50:41 np0005548916 systemd-logind[788]: Removed session 49.
Dec  6 04:50:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:42.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:44.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:44.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:46.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:46.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:47 np0005548916 systemd-logind[788]: New session 51 of user zuul.
Dec  6 04:50:47 np0005548916 systemd[1]: Started Session 51 of User zuul.
Dec  6 04:50:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:48.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:48.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:49 np0005548916 python3.9[132987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:49 np0005548916 systemd[1]: Stopping User Manager for UID 0...
Dec  6 04:50:49 np0005548916 systemd[132242]: Activating special unit Exit the Session...
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped target Main User Target.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped target Basic System.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped target Paths.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped target Sockets.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped target Timers.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  6 04:50:49 np0005548916 systemd[132242]: Closed D-Bus User Message Bus Socket.
Dec  6 04:50:49 np0005548916 systemd[132242]: Stopped Create User's Volatile Files and Directories.
Dec  6 04:50:49 np0005548916 systemd[132242]: Removed slice User Application Slice.
Dec  6 04:50:49 np0005548916 systemd[132242]: Reached target Shutdown.
Dec  6 04:50:49 np0005548916 systemd[132242]: Finished Exit the Session.
Dec  6 04:50:49 np0005548916 systemd[132242]: Reached target Exit the Session.
Dec  6 04:50:49 np0005548916 systemd[1]: user@0.service: Deactivated successfully.
Dec  6 04:50:49 np0005548916 systemd[1]: Stopped User Manager for UID 0.
Dec  6 04:50:49 np0005548916 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  6 04:50:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:49 np0005548916 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  6 04:50:49 np0005548916 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  6 04:50:49 np0005548916 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  6 04:50:49 np0005548916 systemd[1]: Removed slice User Slice of UID 0.
Dec  6 04:50:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:50.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:50 np0005548916 python3.9[133145]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:50.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:51 np0005548916 python3.9[133298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:52 np0005548916 python3.9[133450]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:52.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:52.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:52 np0005548916 python3.9[133603]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:53 np0005548916 python3.9[133755]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:54 np0005548916 python3.9[133906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:54.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:54.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:55 np0005548916 python3.9[134063]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  6 04:50:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:56.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:56.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:57 np0005548916 python3.9[134215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:58 np0005548916 python3.9[134336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014656.561391-219-187282155108088/.source follow=False _original_basename=haproxy.j2 checksum=cc5e97ea900947bff0c19d73b88d99840e041f49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
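The stat/copy pair above is Ansible's idempotent file deployment: the wrapper is only rewritten when the on-disk sha1 differs from the rendered template's checksum (cc5e97ea...). The decision reduces to roughly the following sketch (Ansible's real logic also reconciles mode, ownership, and the SELinux type requested here):

    import hashlib, os

    def needs_update(path, expected_sha1):
        # Rewrite only when the file is absent or its sha1 has drifted.
        if not os.path.exists(path):
            return True
        with open(path, "rb") as f:
            return hashlib.sha1(f.read()).hexdigest() != expected_sha1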
Dec  6 04:50:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:50:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:58.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:50:58 np0005548916 python3.9[134487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:50:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:58.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:59 np0005548916 python3.9[134608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014658.2773345-264-252777079797089/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:00 np0005548916 python3.9[134760]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:51:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:00.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:00.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:01 np0005548916 python3.9[134845]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:51:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:02.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:04.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:05 np0005548916 python3.9[135069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
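This task enables and starts openvswitch.service a few seconds after the dnf task installed the package, the usual install-then-activate pairing. A rough subprocess equivalent (a sketch only; the Ansible systemd module additionally handles masking, daemon reloads, and scope):

    import subprocess

    def ensure_service(name):
        # Start the unit only if it is not already active, then enable it at boot.
        if subprocess.run(["systemctl", "is-active", "--quiet", name]).returncode != 0:
            subprocess.run(["systemctl", "start", name], check=True)
        subprocess.run(["systemctl", "enable", name], check=True)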
Dec  6 04:51:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:51:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:51:06 np0005548916 python3.9[135234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:06 np0005548916 python3.9[135356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014665.531517-375-38292714523132/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:06.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:06.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:07 np0005548916 python3.9[135506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:07 np0005548916 python3.9[135652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014666.7077394-375-213848021789615/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:08.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:08 np0005548916 ovn_controller[132199]: 2025-12-06T09:51:08Z|00025|memory|INFO|16128 kB peak resident set size after 29.8 seconds
Dec  6 04:51:08 np0005548916 ovn_controller[132199]: 2025-12-06T09:51:08Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Dec  6 04:51:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:08.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:08 np0005548916 podman[135679]: 2025-12-06 09:51:08.819315723 +0000 UTC m=+0.113505103 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
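The podman health_status event above embeds the container's full config_data as a Python-style dict, which makes it convenient to recover the healthcheck command or volume list straight from the journal. A sketch, assuming the dict repr closes the line as it does here:

    import ast

    def container_config(log_line):
        # Extract the config_data mapping from a podman health_status event.
        _, _, blob = log_line.partition("config_data=")
        return ast.literal_eval(blob.strip().rstrip(")"))

For this line, container_config(line)["healthcheck"]["test"] yields /openstack/healthcheck.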
Dec  6 04:51:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:09 np0005548916 python3.9[135830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:09 np0005548916 python3.9[135951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014668.86935-507-129788869047860/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:10 np0005548916 python3.9[136102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:10.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:10.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:11 np0005548916 python3.9[136248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014670.0787635-507-57883399109260/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:11 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:11 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:11 np0005548916 python3.9[136398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:12.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:12.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:12 np0005548916 python3.9[136553]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:12 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:13 np0005548916 python3.9[136705]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:14 np0005548916 python3.9[136783]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:14.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:14.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:14 np0005548916 python3.9[136936]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:15 np0005548916 python3.9[137014]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:16 np0005548916 python3.9[137166]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:16.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:16.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:17 np0005548916 python3.9[137319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:17 np0005548916 python3.9[137397]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:17 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:18.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:18 np0005548916 python3.9[137550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:18.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:19 np0005548916 python3.9[137628]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:19 np0005548916 python3.9[137780]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:20 np0005548916 systemd[1]: Reloading.
Dec  6 04:51:20 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:20 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
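
The ansible.builtin.systemd invocation above (name=edpm-container-shutdown, daemon_reload=True, enabled=True, state=started) is what triggers the Reloading and generator lines just logged. A minimal manual equivalent, shown as a sketch of what the module does rather than commands taken from this host:

    # Re-read unit files so systemd sees the freshly installed
    # edpm-container-shutdown.service and 91-edpm-container-shutdown.preset,
    # then enable and start the unit in one step.
    systemctl daemon-reload
    systemctl enable --now edpm-container-shutdown.service
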
Dec  6 04:51:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:20.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:20.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:21 np0005548916 python3.9[137971]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:21 np0005548916 python3.9[138049]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:22 np0005548916 python3.9[138201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:22 np0005548916 python3.9[138280]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:22.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:22 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:23 np0005548916 python3.9[138432]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:23 np0005548916 systemd[1]: Reloading.
Dec  6 04:51:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:23 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:23 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:23 np0005548916 systemd[1]: Starting Create netns directory...
Dec  6 04:51:23 np0005548916 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:51:23 np0005548916 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:51:23 np0005548916 systemd[1]: Finished Create netns directory.
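
The netns-placeholder oneshot leaves almost no trace beyond these lines: /run/netns comes into existence, the transient run-netns-placeholder.mount is released, and the service deactivates immediately. A minimal shell equivalent, assuming the unit only needs to force /run/netns into existence as a mount point (the namespace name here is illustrative):

    # Creating any namespace makes iproute2 mount /run/netns; deleting it
    # again leaves the mount point behind for containers that later bind
    # /run/netns:/run/netns:shared in this run.
    ip netns add placeholder
    ip netns delete placeholder
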
Dec  6 04:51:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:24.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:24 np0005548916 python3.9[138627]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:25 np0005548916 python3.9[138779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:26 np0005548916 python3.9[138902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014685.1568-960-109428248298576/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:26.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:26.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:27 np0005548916 python3.9[139055]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:27 np0005548916 python3.9[139233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:28 np0005548916 python3.9[139357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014687.465574-1035-47698695027449/.source.json _original_basename=._q1aj5vy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
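
The JSON written here is consumed by kolla_set_configs inside the container; its effects show up further down (01-rootwrap.conf copied to /etc/neutron/rootwrap.conf, permissions set under /var/lib/neutron, the service command written to /run_command). A representative sketch in the standard kolla config.json layout, reconstructed from those later lines — the real file is only logged as a checksum, so every value below is inferred:

    # Hypothetical ovn_metadata_agent.json; field layout follows the
    # usual kolla format, values are inferred from this log.
    cat > /var/lib/kolla/config_files/ovn_metadata_agent.json <<'EOF'
    {
        "command": "neutron-ovn-metadata-agent",
        "config_files": [
            {
                "source": "/etc/neutron.conf.d/01-rootwrap.conf",
                "dest": "/etc/neutron/rootwrap.conf",
                "owner": "neutron",
                "perm": "0600"
            }
        ],
        "permissions": [
            {
                "path": "/var/lib/neutron",
                "owner": "neutron:neutron",
                "recurse": true
            }
        ]
    }
    EOF
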
Dec  6 04:51:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:28.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:28.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:29 np0005548916 python3.9[139509]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:30.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:30.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:31 np0005548916 python3.9[139937]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  6 04:51:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:32 np0005548916 python3.9[140090]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:51:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:32.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:32.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:33 np0005548916 python3.9[140242]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:51:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:35 np0005548916 python3[140423]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
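
ansible-edpm_container_manage reads every *.json under config_dir and converges one container per definition; the exact podman command it generated for this definition is echoed verbatim at the PODMAN-CONTAINER-DEBUG line below. Reduced to the behaviour observable in this log, the per-config flow is roughly the following sketch (the sha1sum hashing step in particular is a stand-in for the module's real config-hash mechanism, seen earlier as ansible-container_config_hash):

    # 1. hash the config so a changed definition forces a re-create,
    # 2. create the container with labels edpm uses to find it again,
    # 3. hand lifecycle management to an edpm_<name>.service unit.
    config=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent/ovn_metadata_agent.json
    hash=$(sha1sum "$config" | cut -d' ' -f1)
    podman create --name ovn_metadata_agent \
        --label config_id=ovn_metadata_agent \
        --label container_name=ovn_metadata_agent \
        --label managed_by=edpm_ansible \
        --env EDPM_CONFIG_HASH="$hash" \
        quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
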
Dec  6 04:51:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:36.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:36.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:38.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:38.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:40.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:40.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0023c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:42.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:44 np0005548916 podman[140501]: 2025-12-06 09:51:44.083467935 +0000 UTC m=+4.931364352 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
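
These periodic health_status entries come from podman's healthcheck timer running the test configured in the config_data label (/openstack/healthcheck, bind-mounted read-only from /var/lib/openstack/healthchecks/ovn_controller). The same check can be exercised on demand with standard podman commands:

    # Run the configured healthcheck once, then read back the state
    # podman recorded ('healthy' matches the log line above).
    # Older podman releases expose the field as .State.Healthcheck.Status.
    podman healthcheck run ovn_controller
    podman inspect --format '{{.State.Health.Status}}' ovn_controller
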
Dec  6 04:51:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:44.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:44.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:46 np0005548916 podman[140435]: 2025-12-06 09:51:46.86485706 +0000 UTC m=+11.329815854 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:46.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:47 np0005548916 podman[140598]: 2025-12-06 09:51:47.047940419 +0000 UTC m=+0.068703644 container create 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 04:51:47 np0005548916 podman[140598]: 2025-12-06 09:51:47.013107445 +0000 UTC m=+0.033870730 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:47 np0005548916 python3[140423]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:47 np0005548916 python3.9[140813]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:48.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:48 np0005548916 python3.9[140968]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:49 np0005548916 python3.9[141044]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:49 np0005548916 python3.9[141195]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014709.207118-1299-223677700029897/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
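
The unit file installed here is what the restart a few lines below starts. Its body is not logged, but neighbouring lines fix its shape: systemd later prints "Starting ovn_metadata_agent container...", and startup runs /var/local/libexec/edpm-start-podman-container (the script staged at the top of this section). A plausible reconstruction, labelled as such — everything except the unit name, description, and ExecStart target is assumption:

    # Hypothetical /etc/systemd/system/edpm_ovn_metadata_agent.service
    cat > /etc/systemd/system/edpm_ovn_metadata_agent.service <<'EOF'
    [Unit]
    Description=ovn_metadata_agent container
    After=network-online.target

    [Service]
    Restart=always
    ExecStart=/var/local/libexec/edpm-start-podman-container ovn_metadata_agent
    ExecStop=/usr/bin/podman stop ovn_metadata_agent

    [Install]
    WantedBy=multi-user.target
    EOF
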
Dec  6 04:51:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:50 np0005548916 python3.9[141271]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:51:50 np0005548916 systemd[1]: Reloading.
Dec  6 04:51:50 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:50 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:51:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:50.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:51:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:50.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:51 np0005548916 python3.9[141383]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:51 np0005548916 systemd[1]: Reloading.
Dec  6 04:51:51 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:51 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:51 np0005548916 systemd[1]: Starting ovn_metadata_agent container...
Dec  6 04:51:51 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:51:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbf687c04257492bf54ae160cdeb8f8c130ac17bc1e26ca1c1d96f233206af59/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  6 04:51:51 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbf687c04257492bf54ae160cdeb8f8c130ac17bc1e26ca1c1d96f233206af59/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 04:51:51 np0005548916 systemd[1]: Started /usr/bin/podman healthcheck run 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b.
Dec  6 04:51:51 np0005548916 podman[141425]: 2025-12-06 09:51:51.983987137 +0000 UTC m=+0.182734561 container init 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + sudo -E kolla_set_configs
Dec  6 04:51:52 np0005548916 podman[141425]: 2025-12-06 09:51:52.015956319 +0000 UTC m=+0.214703713 container start 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 04:51:52 np0005548916 edpm-start-podman-container[141425]: ovn_metadata_agent
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Validating config file
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Copying service configuration files
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Writing out command to execute
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  6 04:51:52 np0005548916 edpm-start-podman-container[141424]: Creating additional drop-in dependency for "ovn_metadata_agent" (4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b)
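
The drop-in's contents are not logged. Given the container's depends_on entry ('openvswitch.service' in its config_data label) and the systemd Reloading that follows, a representative ordering drop-in could look like this (purely illustrative; only the dependency name comes from the log):

    # Hypothetical drop-in tying the container unit to its dependency.
    mkdir -p /etc/systemd/system/edpm_ovn_metadata_agent.service.d
    cat > /etc/systemd/system/edpm_ovn_metadata_agent.service.d/override.conf <<'EOF'
    [Unit]
    After=openvswitch.service
    Requires=openvswitch.service
    EOF
    systemctl daemon-reload
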
Dec  6 04:51:52 np0005548916 podman[141448]: 2025-12-06 09:51:52.096931807 +0000 UTC m=+0.069438773 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
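The health_status=healthy event above is podman running the configured '/openstack/healthcheck' test. The same state can be read back after the fact; a small sketch using `podman inspect` with the container name from the log:

    import json, subprocess

    out = subprocess.run(["podman", "inspect", "ovn_metadata_agent"],
                         check=True, capture_output=True, text=True).stdout
    health = json.loads(out)[0]["State"]["Health"]
    print(health["Status"], health["FailingStreak"])   # e.g. "healthy" 0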
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: ++ cat /run_command
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + CMD=neutron-ovn-metadata-agent
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + ARGS=
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + sudo kolla_copy_cacerts
Dec  6 04:51:52 np0005548916 systemd[1]: Reloading.
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + [[ ! -n '' ]]
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + . kolla_extend_start
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: Running command: 'neutron-ovn-metadata-agent'
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + umask 0022
Dec  6 04:51:52 np0005548916 ovn_metadata_agent[141441]: + exec neutron-ovn-metadata-agent
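The '+' lines above are bash xtrace output from kolla's start script: it reads the command written out earlier, sets the umask, and exec()s the agent so it becomes the container's main process. A minimal Python equivalent of that sequence, for illustration only (the `sudo kolla_copy_cacerts` and `kolla_extend_start` steps are omitted):

    import os, shlex

    def kolla_start():
        cmd = open("/run_command").read().strip()   # e.g. "neutron-ovn-metadata-agent"
        print(f"Running command: '{cmd}'")
        os.umask(0o022)                             # matches "+ umask 0022"
        argv = shlex.split(cmd)
        os.execvp(argv[0], argv)                    # replaces this process; never returns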
Dec  6 04:51:52 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:52 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:52 np0005548916 systemd[1]: Started ovn_metadata_agent container.
Dec  6 04:51:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:52.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:52.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
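The radosgw 'beast:' lines are RGW's access log: client address, user, timestamp, request line, HTTP status, body bytes, and latency. A hedged parsing sketch, assuming the field layout seen in the two entries above:

    import re

    PAT = re.compile(
        r'beast: (?P<handle>0x[0-9a-f]+): (?P<client>\S+) - (?P<user>\S+) '
        r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
        r'.*latency=(?P<latency>[\d.]+)s')

    line = ('beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous '
            '[06/Dec/2025:09:51:52.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
            'latency=0.000000000s')
    m = PAT.search(line)
    print(m["client"], m["request"], m["status"], m["latency"])
    # 192.168.122.100 HEAD / HTTP/1.0 200 0.000000000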
Dec  6 04:51:52 np0005548916 systemd[1]: session-51.scope: Deactivated successfully.
Dec  6 04:51:52 np0005548916 systemd[1]: session-51.scope: Consumed 1min 2.948s CPU time.
Dec  6 04:51:52 np0005548916 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Dec  6 04:51:52 np0005548916 systemd-logind[788]: Removed session 51.
Dec  6 04:51:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
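The ceph-mon line above reports its cache budgets in raw bytes; converted to MiB for readability (values copied from the entry):

    for name, nbytes in [("cache_size", 1020054731), ("inc_alloc", 348127232),
                         ("full_alloc", 348127232), ("kv_alloc", 318767104)]:
        print(f"{name}: {nbytes / 2**20:.0f} MiB")
    # cache_size ~973 MiB, inc_alloc/full_alloc 332 MiB, kv_alloc 304 MiB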
Dec  6 04:51:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.207 141446 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.207 141446 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.228 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.265 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.281 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 61eba479-a995-4b31-88b9-8ebfcea9907e (UUID: 61eba479-a995-4b31-88b9-8ebfcea9907e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.306 141446 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.310 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.317 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.326 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '61eba479-a995-4b31-88b9-8ebfcea9907e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6d96f5b8b0>], external_ids={}, name=61eba479-a995-4b31-88b9-8ebfcea9907e, nb_cfg_timestamp=1765014646996, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.327 141446 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6d96f4cf70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.329 141446 INFO oslo_service.service [-] Starting 1 workers
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.334 141446 DEBUG oslo_service.service [-] Started child 141558 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.339 141558 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2001887'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.361 141446 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpgd8t6f9_/privsep.sock']
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.398 141558 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.399 141558 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.399 141558 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.403 141558 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.411 141558 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  6 04:51:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.421 141558 INFO eventlet.wsgi.server [-] (141558) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec  6 04:51:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:54.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:54.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:54 np0005548916 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.119 141446 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.120 141446 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgd8t6f9_/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.937 141563 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.943 141563 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.947 141563 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.947 141563 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141563
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.123 141563 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4e6ae1-6b2a-4e54-a88e-3e4617c1dd17]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  6 04:51:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 04:51:55 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.295 141563 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2e1070-6678-498d-bf82-5f3b3c28deac]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.298 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, column=external_ids, values=({'neutron:ovn-metadata-id': '43f55786-9e75-56e3-ac2c-7faf4144e8c1'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.311 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.319 141446 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.319 141446 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 04:51:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:56.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
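
The radosgw triples here and throughout (request start, request done, beast access line) arrive every two seconds, alternating between 192.168.122.102 and 192.168.122.100: anonymous `HEAD / HTTP/1.0` probes answered with 200 and near-zero latency, i.e. load-balancer health checks rather than client traffic. An equivalent probe, with the caveats that the RGW endpoint and port are assumptions (this log never records them) and that http.client speaks HTTP/1.1 rather than the checker's HTTP/1.0:

    # Hedged sketch of the health probe seen in the beast access log.
    import http.client

    conn = http.client.HTTPConnection('localhost', 8080)  # RGW endpoint assumed
    conn.request('HEAD', '/')
    print(conn.getresponse().status)  # expect 200, matching the log
    conn.close()
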
Dec  6 04:51:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:51:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:56.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:51:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:58 np0005548916 systemd-logind[788]: New session 52 of user zuul.
Dec  6 04:51:58 np0005548916 systemd[1]: Started Session 52 of User zuul.
Dec  6 04:51:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:58.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:51:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:58.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:59 np0005548916 python3.9[141723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:51:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:00 np0005548916 python3.9[141881]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
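
The command task above probes whether a `nova_virtlogd` container exists ahead of the nova-libvirt cleanup that follows: `--filter name=^nova_virtlogd$` anchors the name match, and `--format {{.Names}}` is a Go template that prints only the name column (the backslashes are Ansible's quoting, not part of the template). Roughly the same check as a direct call:

    # Rough equivalent of the podman ps task above.
    import subprocess

    out = subprocess.run(
        ['podman', 'ps', '-a',
         '--filter', 'name=^nova_virtlogd$',
         '--format', '{{.Names}}'],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out if out else 'no nova_virtlogd container found')
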
Dec  6 04:52:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:52:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:00.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:52:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:00.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:02 np0005548916 python3.9[142046]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:52:02 np0005548916 systemd[1]: Reloading.
Dec  6 04:52:02 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:52:02 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:52:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:02.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:02.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:02 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:03 np0005548916 python3.9[142232]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:52:03 np0005548916 network[142249]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:52:03 np0005548916 network[142250]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:52:03 np0005548916 network[142251]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:52:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:04.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:52:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:04.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:52:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:06.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:06.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:07 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:08.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:52:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:08.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:52:09 np0005548916 python3.9[142541]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095209 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:52:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:09 np0005548916 python3.9[142694]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:10.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:10 np0005548916 python3.9[142848]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:10.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:11 np0005548916 python3.9[143085]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:12.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:12.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:52:13 np0005548916 python3.9[143239]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:52:14 np0005548916 python3.9[143394]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000075s ======
Dec  6 04:52:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:14.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000075s
Dec  6 04:52:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:14.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:15 np0005548916 python3.9[143547]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
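
Between 09:52:09 and 09:52:15 the play works through the whole tripleo_nova_* family (the libvirt target plus the virtlogd wrapper, virtnodedevd, virtproxyd, virtqemud, virtsecretd, and virtstoraged services), stopping and disabling each with ansible.builtin.systemd_service. The sequence is roughly:

    # Rough equivalent of the stop+disable tasks above;
    # state=stopped + enabled=False ~ `systemctl disable --now`.
    import subprocess

    units = [
        'tripleo_nova_libvirt.target',
        'tripleo_nova_virtlogd_wrapper.service',
        'tripleo_nova_virtnodedevd.service',
        'tripleo_nova_virtproxyd.service',
        'tripleo_nova_virtqemud.service',
        'tripleo_nova_virtsecretd.service',
        'tripleo_nova_virtstoraged.service',
    ]
    for unit in units:
        subprocess.run(['systemctl', 'disable', '--now', unit], check=False)
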
Dec  6 04:52:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:15 np0005548916 podman[143573]: 2025-12-06 09:52:15.989982639 +0000 UTC m=+0.285705204 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
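
The podman record above is a health_status event for the ovn_controller container: the embedded config_data shows the check mounts /var/lib/openstack/healthchecks/ovn_controller into the container and runs /openstack/healthcheck, and health_status=healthy with health_failing_streak=0 means the probe passed. One way to read the same state back, assuming podman's inspect schema exposes .State.Health for containers that define a healthcheck:

    # Hedged sketch: query the health state reported by the event above.
    import json
    import subprocess

    raw = subprocess.run(
        ['podman', 'inspect', 'ovn_controller',
         '--format', '{{json .State.Health}}'],
        capture_output=True, text=True, check=True,
    ).stdout
    print(json.loads(raw).get('Status'))  # e.g. "healthy"
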
Dec  6 04:52:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:16 np0005548916 python3.9[143728]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:16.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:17 np0005548916 python3.9[143880]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:18 np0005548916 python3.9[144032]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:52:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:18.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:18.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:18 np0005548916 python3.9[144185]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:19 np0005548916 python3.9[144337]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:20 np0005548916 python3.9[144489]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:20.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:21 np0005548916 python3.9[144642]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:52:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:52:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:22 np0005548916 podman[144766]: 2025-12-06 09:52:22.240349389 +0000 UTC m=+0.066446878 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 04:52:22 np0005548916 python3.9[144809]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:22.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:22.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:23 np0005548916 python3.9[144968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:23 np0005548916 python3.9[145120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
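
Taken together, the reaper lines trace one complete NFS grace cycle: the server entered a 90-second grace period at 09:52:18, reloaded client reclaim info from the backend at 09:52:21, found nothing to wait for (reclaim complete(0), clid count(0)), and lifted grace early at 09:52:24 rather than running out the full window.
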
Dec  6 04:52:24 np0005548916 python3.9[145273]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:24.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:24.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:25 np0005548916 python3.9[145450]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:25 np0005548916 python3.9[145602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:26 np0005548916 python3.9[145755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:52:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:26.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:52:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:27 np0005548916 python3.9[145907]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
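
In the _raw_params above, `#012` is the syslog escape for a newline; decoded, the shell the task runs is:

    if systemctl is-active certmonger.service; then
      systemctl disable --now certmonger.service
      test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
    fi

That is, certmonger is stopped and disabled only if currently active, and masked only when no local unit override already exists under /etc/systemd/system.
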
Dec  6 04:52:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:28 np0005548916 python3.9[146084]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:52:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:28.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:28.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:29 np0005548916 python3.9[146237]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:52:29 np0005548916 systemd[1]: Reloading.
Dec  6 04:52:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095229 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:52:29 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:52:29 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:52:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:30 np0005548916 python3.9[146425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:30.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:30.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:31 np0005548916 python3.9[146579]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:31 np0005548916 python3.9[146732]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:32 np0005548916 python3.9[146885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:32 np0005548916 python3.9[147039]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:32.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:33 np0005548916 python3.9[147192]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:34 np0005548916 python3.9[147345]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
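
The reset-failed pass at 09:52:30 through 09:52:34 completes the earlier stop/disable and unit-file removal: after the daemon reload at 09:52:29 re-read the unit files, `systemctl reset-failed` clears any lingering failed state for each deleted tripleo_nova_* unit so none of them keep appearing in `systemctl --failed` output.
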
Dec  6 04:52:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:34.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:34.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:35 np0005548916 python3.9[147499]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  6 04:52:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:36 np0005548916 python3.9[147652]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:52:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:36.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:36.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:37 np0005548916 python3.9[147811]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:52:37 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:52:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:38.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:38.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:40 np0005548916 python3.9[147973]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:52:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:40.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:40.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:41 np0005548916 python3.9[148058]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:52:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:42.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:42.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:44.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:44.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:46 np0005548916 podman[148073]: 2025-12-06 09:52:46.866274581 +0000 UTC m=+0.153602119 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  6 04:52:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:46.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:48.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:48.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c002730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:50.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:50.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:52 np0005548916 podman[148245]: 2025-12-06 09:52:52.747101651 +0000 UTC m=+0.054547298 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  6 04:52:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:52.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:52.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.260 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 04:52:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 04:52:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 04:52:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:54.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:54.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:56.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:56.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:58.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:52:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:58.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:00.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:00.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004220 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:02.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:02.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  6 04:53:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:04.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  6 04:53:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:04.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:06.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:07.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:08.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:09.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:10.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:10 np0005548916 kernel: SELinux:  Converting 2773 SID table entries...
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:53:10 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:53:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:11.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:12.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:13.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:14.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:15.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:16.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:17 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  6 04:53:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:17 np0005548916 podman[148371]: 2025-12-06 09:53:17.854505158 +0000 UTC m=+0.139767064 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 04:53:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:19.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:20.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:21.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:22.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:23.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:23 np0005548916 kernel: SELinux:  Converting 2773 SID table entries...
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:53:23 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:53:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:23 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  6 04:53:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:23 np0005548916 podman[148408]: 2025-12-06 09:53:23.775344381 +0000 UTC m=+0.065008354 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:53:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:24.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:25.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:53:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:53:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:26.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:27.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095327 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:53:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:30.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:31.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:32.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:33.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:34.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:35.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:53:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:36.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:37.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:38.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:39.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:53:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:53:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:40.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:53:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:42.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:44.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:45.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:47.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095347 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:53:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:48 np0005548916 podman[155431]: 2025-12-06 09:53:48.160889725 +0000 UTC m=+0.119172911 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec  6 04:53:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:49.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:51.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:52.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:53.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.261 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:53:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.262 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:53:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.262 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:53:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:54 np0005548916 podman[159219]: 2025-12-06 09:53:54.774899091 +0000 UTC m=+0.067146588 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 04:53:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:54.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:55.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:56.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:58.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:53:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:00.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:04.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:05.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:06.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:07.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:08.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:09.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:10.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:11.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:12.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:13.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:14.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:15.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c30 fd 48 proxy ignored for local
Dec  6 04:54:15 np0005548916 kernel: ganesha.nfsd[162192]: segfault at 50 ip 00007f8434c9b32e sp 00007f83e9ffa210 error 4 in libntirpc.so.5.8[7f8434c80000+2c000] likely on CPU 6 (core 0, socket 6)
Dec  6 04:54:15 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:54:15 np0005548916 systemd[1]: Started Process Core Dump (PID 165495/UID 0).
Dec  6 04:54:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:16.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:17.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:19.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:19 np0005548916 systemd-coredump[165496]: Process 130685 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 67:#012#0  0x00007f8434c9b32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:54:19 np0005548916 podman[165499]: 2025-12-06 09:54:19.928122694 +0000 UTC m=+1.226087114 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:54:19 np0005548916 systemd[1]: systemd-coredump@5-165495-0.service: Deactivated successfully.
Dec  6 04:54:19 np0005548916 systemd[1]: systemd-coredump@5-165495-0.service: Consumed 1.492s CPU time.
Dec  6 04:54:20 np0005548916 podman[165530]: 2025-12-06 09:54:20.035802592 +0000 UTC m=+0.031149655 container died aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec  6 04:54:20 np0005548916 systemd[1]: var-lib-containers-storage-overlay-40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8-merged.mount: Deactivated successfully.
Dec  6 04:54:20 np0005548916 podman[165530]: 2025-12-06 09:54:20.177507138 +0000 UTC m=+0.172854171 container remove aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  6 04:54:20 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:54:20 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:54:20 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.123s CPU time.
Dec  6 04:54:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:20.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:21.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:21 np0005548916 kernel: SELinux:  Converting 2774 SID table entries...
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:54:21 np0005548916 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:54:22 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:54:22 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  6 04:54:22 np0005548916 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec  6 04:54:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:22.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:23.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:24.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:25.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095425 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:54:25 np0005548916 podman[165648]: 2025-12-06 09:54:25.767345949 +0000 UTC m=+0.067901083 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  6 04:54:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:26.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:27.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:28.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:29.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:30 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 6.
Dec  6 04:54:30 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:54:30 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.123s CPU time.
Dec  6 04:54:30 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:54:30 np0005548916 podman[166348]: 2025-12-06 09:54:30.728618311 +0000 UTC m=+0.055959744 container create 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:54:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:54:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:54:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:54:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:54:30 np0005548916 podman[166348]: 2025-12-06 09:54:30.698824953 +0000 UTC m=+0.026166406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:54:30 np0005548916 podman[166348]: 2025-12-06 09:54:30.800610969 +0000 UTC m=+0.127952432 container init 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:54:30 np0005548916 podman[166348]: 2025-12-06 09:54:30.806526021 +0000 UTC m=+0.133867454 container start 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:54:30 np0005548916 bash[166348]: 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8
Dec  6 04:54:30 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:54:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:54:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:30.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:31 np0005548916 systemd[1]: Stopping OpenSSH server daemon...
Dec  6 04:54:31 np0005548916 systemd[1]: sshd.service: Deactivated successfully.
Dec  6 04:54:31 np0005548916 systemd[1]: Stopped OpenSSH server daemon.
Dec  6 04:54:31 np0005548916 systemd[1]: sshd.service: Consumed 3.137s CPU time, read 564.0K from disk, written 0B to disk.
Dec  6 04:54:31 np0005548916 systemd[1]: Stopped target sshd-keygen.target.
Dec  6 04:54:31 np0005548916 systemd[1]: Stopping sshd-keygen.target...
Dec  6 04:54:31 np0005548916 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:31 np0005548916 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:31 np0005548916 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:31 np0005548916 systemd[1]: Reached target sshd-keygen.target.
Dec  6 04:54:31 np0005548916 systemd[1]: Starting OpenSSH server daemon...
Dec  6 04:54:31 np0005548916 systemd[1]: Started OpenSSH server daemon.
Dec  6 04:54:31 np0005548916 podman[166761]: 2025-12-06 09:54:31.551201935 +0000 UTC m=+0.070690725 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:54:31 np0005548916 podman[166761]: 2025-12-06 09:54:31.678793646 +0000 UTC m=+0.198282396 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:54:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:32.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:33.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:33 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:33 np0005548916 podman[166956]: 2025-12-06 09:54:33.490672304 +0000 UTC m=+1.283561218 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:54:33 np0005548916 podman[166956]: 2025-12-06 09:54:33.504569303 +0000 UTC m=+1.297458207 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:54:33 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:54:33 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:54:33 np0005548916 podman[167138]: 2025-12-06 09:54:33.912668281 +0000 UTC m=+0.086099433 container exec 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  6 04:54:33 np0005548916 podman[167138]: 2025-12-06 09:54:33.926773305 +0000 UTC m=+0.100204437 container exec_died 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 04:54:34 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:34 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:34 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:34 np0005548916 podman[167271]: 2025-12-06 09:54:34.21651274 +0000 UTC m=+0.069505164 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:54:34 np0005548916 podman[167271]: 2025-12-06 09:54:34.228809137 +0000 UTC m=+0.081801561 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 04:54:34 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:54:34 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:34 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:34 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:34 np0005548916 podman[167584]: 2025-12-06 09:54:34.957284703 +0000 UTC m=+0.517641487 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, distribution-scope=public, name=keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=keepalived for Ceph, architecture=x86_64)
Dec  6 04:54:34 np0005548916 podman[167584]: 2025-12-06 09:54:34.974785324 +0000 UTC m=+0.535142108 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, description=keepalived for Ceph, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2023-02-22T09:23:20)
Dec  6 04:54:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:34.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:35.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:54:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:54:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:54:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:36.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:37.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:38.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:39.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:40.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:41.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:42.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:54:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:43.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140013b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:43 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:54:43 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:54:43 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 12.685s CPU time.
Dec  6 04:54:43 np0005548916 systemd[1]: run-r155ba3ec05ea4cb393ee9881f7853740.service: Deactivated successfully.
Dec  6 04:54:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:45.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095445 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:54:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:47.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:47.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:49.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:49.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:50 np0005548916 podman[175916]: 2025-12-06 09:54:50.834439672 +0000 UTC m=+0.134962704 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:54:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:51.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:51.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:52 np0005548916 python3.9[176073]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:52 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:52 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:52 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:53.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:53.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:53 np0005548916 python3.9[176263]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:53 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:53 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:53 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:54:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:54:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.266 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
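[annotation] The three DEBUG lines above are oslo.concurrency's standard acquire/acquired/released trace around ProcessMonitor._check_child_processes. That trace is emitted whenever a method runs under a named lockutils lock; a minimal sketch of the pattern, assuming oslo.concurrency is installed (this is an illustration of the logging pattern, not Neutron's actual source):

```python
from oslo_concurrency import lockutils

class ProcessMonitor:
    @lockutils.synchronized("_check_child_processes")
    def _check_child_processes(self):
        # Runs with the named lock held; lockutils logs the
        # acquiring / acquired / released lines seen in the agent log.
        pass
```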
Dec  6 04:54:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:54 np0005548916 python3.9[176454]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:54 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:55 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:55.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:55 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:54:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:55.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:54:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:56 np0005548916 python3.9[176644]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:56 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:56 np0005548916 podman[176646]: 2025-12-06 09:54:56.114100249 +0000 UTC m=+0.060075521 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 04:54:56 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:56 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:57.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:57.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:57 np0005548916 python3.9[176853]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:54:57 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:57 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:57 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:58 np0005548916 python3.9[177044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:54:58 np0005548916 systemd[1]: Reloading.
Dec  6 04:54:58 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:58 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:59.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:54:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:59 np0005548916 python3.9[177234]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:54:59 np0005548916 systemd[1]: Reloading.
Dec  6 04:55:00 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:00 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:01.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:01 np0005548916 python3.9[177425]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:02 np0005548916 python3.9[177580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:02 np0005548916 systemd[1]: Reloading.
Dec  6 04:55:02 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:02 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:03.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:03.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095504 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
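[annotation] The haproxy wrapper for the NFS service marks backend/nfs.cephfs.1 DOWN on a Layer4 "Connection refused" (it returns UP at 09:55:26 further below, once the ganesha restart settles). HAProxy's Layer4 check is just a TCP connect; a minimal sketch of the same probe, with host and port as placeholder assumptions:

```python
import socket

# Placeholder backend address; substitute the real nfs.cephfs.1 host:port.
HOST, PORT = "192.168.122.107", 2049

def layer4_check(timeout: float = 1.0) -> bool:
    # HAProxy's default "check" is a plain TCP handshake: connect, then close.
    try:
        with socket.create_connection((HOST, PORT), timeout=timeout):
            return True
    except OSError:          # refused, timeout, unreachable -> DOWN
        return False

print("backend up:", layer4_check())
```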
Dec  6 04:55:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:05.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:05.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:05 np0005548916 python3.9[177772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:55:06 np0005548916 systemd[1]: Reloading.
Dec  6 04:55:06 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:06 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:06 np0005548916 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  6 04:55:06 np0005548916 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  6 04:55:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:07.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:07.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:07 np0005548916 python3.9[177965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:08 np0005548916 python3.9[178120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:09 np0005548916 python3.9[178301]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:09.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:09 np0005548916 python3.9[178456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:10 np0005548916 python3.9[178612]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:11.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:12 np0005548916 python3.9[178768]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:13.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:13.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:13 np0005548916 python3.9[178923]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:14 np0005548916 python3.9[179078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:55:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:15.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:15 np0005548916 python3.9[179237]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:17.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:17 np0005548916 python3.9[179393]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:17.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:55:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:55:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:18 np0005548916 python3.9[179548]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:18 np0005548916 python3.9[179704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:19.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:19.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:19 np0005548916 python3.9[179859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
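[annotation] The reaper lines trace a full ganesha grace cycle: IN GRACE at 09:55:14 with a 90 s budget, client reclaim state reloaded from the backend, then the grace lifted early at 09:55:20 because no clients had state to reclaim (clid count(0)). A small sketch for pulling such grace windows out of a log like this one; the log path is an assumption:

```python
import re

# Hypothetical path; point this at the journal export being read here.
LOG = "/var/log/messages"

start = None
with open(LOG, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        if "NFS Server Now IN GRACE" in line:
            # First HH:MM:SS on the line is the leading syslog timestamp.
            start = re.search(r"\d{2}:\d{2}:\d{2}", line).group()
        elif "NFS Server Now NOT IN GRACE" in line and start:
            end = re.search(r"\d{2}:\d{2}:\d{2}", line).group()
            print(f"grace window: {start} -> {end}")
            start = None
```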
Dec  6 04:55:20 np0005548916 python3.9[180015]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:21.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:21.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:21 np0005548916 podman[180043]: 2025-12-06 09:55:21.835770009 +0000 UTC m=+0.132579191 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:55:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:23.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:23.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:23 np0005548916 python3.9[180197]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:24 np0005548916 python3.9[180349]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:24 np0005548916 python3.9[180502]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:25.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:25 np0005548916 python3.9[180654]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:26 np0005548916 python3.9[180806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:26 np0005548916 podman[180931]: 2025-12-06 09:55:26.744938289 +0000 UTC m=+0.066273241 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 04:55:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095526 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:55:26 np0005548916 python3.9[180974]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
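[annotation] The run of ansible-ansible.builtin.file tasks above prepares the libvirt/QEMU PKI directories, each labeled with the SELinux type container_file_t so the containerized services can read them. A rough non-Ansible equivalent, as a sketch (the tasks also set owner root and, for the qemu directory, group qemu; ownership is omitted here for brevity):

```python
import os
import subprocess

# Directories from the file tasks above, created with mode 0755 where given
# and relabeled with the SELinux type the tasks request via setype=.
for path in ("/etc/pki/libvirt", "/etc/pki/libvirt/private",
             "/etc/pki/CA", "/etc/pki/qemu"):
    os.makedirs(path, mode=0o755, exist_ok=True)
    subprocess.run(["chcon", "-t", "container_file_t", path], check=True)
```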
Dec  6 04:55:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:27.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:28 np0005548916 python3.9[181130]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:29 np0005548916 python3.9[181281]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014927.6592553-1623-269269377540596/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
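[annotation] Each ansible-ansible.legacy.copy record logs the SHA-1 of the content it deployed (here d7a72ae9... for /etc/libvirt/virtlogd.conf). A quick sketch for verifying a deployed file against the checksum captured in the log:

```python
import hashlib

def sha1sum(path: str) -> str:
    h = hashlib.sha1()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Checksum copied from the copy task's log entry above.
assert sha1sum("/etc/libvirt/virtlogd.conf") == \
    "d7a72ae92c2c205983b029473e05a6aa4c58ec24"
```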
Dec  6 04:55:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:29.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:29.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:29 np0005548916 python3.9[181433]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:30 np0005548916 python3.9[181559]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014929.2874403-1623-88322411598992/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:31.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:31.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:31 np0005548916 python3.9[181711]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:32 np0005548916 python3.9[181836]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014930.9420683-1623-256027691618372/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:32 np0005548916 python3.9[181989]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:33.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:33.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:33 np0005548916 python3.9[182114]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014932.260272-1623-129593959920977/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:34 np0005548916 python3.9[182266]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:34 np0005548916 python3.9[182392]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014933.7534971-1623-115910026622844/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:35.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:35.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:35 np0005548916 python3.9[182544]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:36 np0005548916 python3.9[182669]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014934.9705372-1623-270543221199491/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:36 np0005548916 python3.9[182822]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:37.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:37.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:37 np0005548916 python3.9[182945]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014936.2861311-1623-256080245427790/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:38 np0005548916 python3.9[183097]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:38 np0005548916 python3.9[183223]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014937.6553319-1623-227135029814572/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:39.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:40 np0005548916 python3.9[183376]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  6 04:55:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:41.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:41 np0005548916 python3.9[183529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:42 np0005548916 python3.9[183681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:42 np0005548916 python3.9[183891]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:43.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:43 np0005548916 python3.9[184066]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:44 np0005548916 python3.9[184218]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:44 np0005548916 python3.9[184371]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:45.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:45.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:45 np0005548916 python3.9[184523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:46 np0005548916 python3.9[184677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:55:46 np0005548916 python3.9[184830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:47.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:55:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:47 np0005548916 python3.9[184982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:48 np0005548916 python3.9[185136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:49.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:49.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:49 np0005548916 python3.9[185312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:50 np0005548916 python3.9[185464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:50 np0005548916 python3.9[185617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:51.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:51.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:51 np0005548916 python3.9[185769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:52 np0005548916 podman[185815]: 2025-12-06 09:55:52.149266552 +0000 UTC m=+0.088141295 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec  6 04:55:52 np0005548916 python3.9[185944]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014951.4663227-2286-127800449495628/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:53 np0005548916 python3.9[186096]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:53.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:53 np0005548916 python3.9[186219]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014952.6533813-2286-135091598438569/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.264 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 04:55:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 04:55:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 04:55:54 np0005548916 python3.9[186371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:54 np0005548916 python3.9[186495]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014953.86701-2286-218499335610294/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:55:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:55.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:55:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:55 np0005548916 python3.9[186647]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:56 np0005548916 python3.9[186770]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014955.134994-2286-132600693924501/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:56 np0005548916 python3.9[186923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:57.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:57 np0005548916 podman[187018]: 2025-12-06 09:55:57.196214895 +0000 UTC m=+0.055871763 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 04:55:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:57.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:57 np0005548916 python3.9[187065]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014956.3551629-2286-241862508862875/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:58 np0005548916 python3.9[187217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:58 np0005548916 python3.9[187341]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014957.5776188-2286-159478002584401/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:59.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:55:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:59.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:59 np0005548916 python3.9[187493]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:59 np0005548916 python3.9[187616]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014958.7831938-2286-58637370012528/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:00 np0005548916 python3.9[187769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:01 np0005548916 python3.9[187892]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014960.0902197-2286-99531107673205/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:01.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:01.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:01 np0005548916 python3.9[188044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:02 np0005548916 python3.9[188167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014961.3119662-2286-208247083521326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:02 np0005548916 python3.9[188320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:03.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:03.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:03 np0005548916 python3.9[188443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014962.4706473-2286-15615144229666/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:04 np0005548916 python3.9[188595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:04 np0005548916 python3.9[188719]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014963.6607192-2286-251782830922038/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:05.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:05.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:05 np0005548916 python3.9[188871]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:05 np0005548916 python3.9[188994]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014964.8524275-2286-52076285670208/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:06 np0005548916 python3.9[189147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:07.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:07 np0005548916 python3.9[189270]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014966.1406472-2286-32012174891742/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:07.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095607 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
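One plausible reading of the recurring ganesha "proxy header rest len failed" events, together with this haproxy Layer4 DOWN line, is that the ganesha listener expects the PROXY protocol (it sits behind the co-located haproxy-nfs service), while bare TCP probes open and close the socket without sending a PROXY header, so TIRPC fails the header parse and marks the transport dead. A sketch under that assumption; host and port are hypothetical stand-ins for the NFS endpoint:

    #!/usr/bin/env python3
    # Sketch: why a bare TCP probe can upset a PROXY-protocol listener.
    # HOST/PORT are hypothetical; this is an illustration, not this
    # deployment's actual check.
    import socket

    HOST, PORT = "127.0.0.1", 2049

    # A plain connect-and-close (what a Layer4 check does) sends no
    # PROXY header, so the listener's header parse fails as logged above.
    probe = socket.create_connection((HOST, PORT), timeout=1)
    probe.close()

    # A frontend in haproxy "send-proxy" mode would first emit a
    # PROXY protocol v1 line before any RPC bytes:
    s = socket.create_connection((HOST, PORT), timeout=1)
    s.sendall(b"PROXY TCP4 192.168.122.100 192.168.122.107 40000 2049\r\n")
    s.close()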
Dec  6 04:56:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:07 np0005548916 python3.9[189422]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:08 np0005548916 python3.9[189546]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014967.4085715-2286-132152143828031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:09.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.002398) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970002640, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4355, "num_deletes": 501, "total_data_size": 11825558, "memory_usage": 11978824, "flush_reason": "Manual Compaction"}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970040683, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4439644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13251, "largest_seqno": 17601, "table_properties": {"data_size": 4428209, "index_size": 6457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30991, "raw_average_key_size": 19, "raw_value_size": 4401079, "raw_average_value_size": 2824, "num_data_blocks": 282, "num_entries": 1558, "num_filter_entries": 1558, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014557, "oldest_key_time": 1765014557, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 38271 microseconds, and 12276 cpu microseconds.
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.040761) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4439644 bytes OK
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.040805) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042867) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042894) EVENT_LOG_v1 {"time_micros": 1765014970042888, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042916) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11806248, prev total WAL file size 11806248, number of live WAL files 2.
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.046103) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4335KB)], [27(13MB)]
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970046268, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18464773, "oldest_snapshot_seqno": -1}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5031 keys, 13936321 bytes, temperature: kUnknown
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970178300, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13936321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13900701, "index_size": 21942, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 125913, "raw_average_key_size": 25, "raw_value_size": 13807514, "raw_average_value_size": 2744, "num_data_blocks": 917, "num_entries": 5031, "num_filter_entries": 5031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.178807) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13936321 bytes
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.179962) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.6 rd, 105.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.4 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.1) OK, records in: 5852, records dropped: 821 output_compression: NoCompression
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.179979) EVENT_LOG_v1 {"time_micros": 1765014970179971, "job": 14, "event": "compaction_finished", "compaction_time_micros": 132243, "compaction_time_cpu_micros": 48198, "output_level": 6, "num_output_files": 1, "total_output_size": 13936321, "num_input_records": 5852, "num_output_records": 5031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
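RocksDB's EVENT_LOG_v1 lines embed a JSON object after the marker, so the flush and compaction stats above (job 13's flush_finished, job 14's compaction_finished) can be recovered mechanically rather than read by eye. A small sketch, assuming the lines were saved to a file:

    #!/usr/bin/env python3
    # Sketch: pull flush/compaction events out of the ceph-mon rocksdb
    # lines above by parsing the JSON after the EVENT_LOG_v1 marker.
    import json

    MARK = "EVENT_LOG_v1 "
    with open("messages.log") as fh:   # hypothetical capture of this syslog
        for line in fh:
            i = line.find(MARK)
            if i == -1:
                continue
            ev = json.loads(line[i + len(MARK):])
            if ev.get("event") in ("flush_finished", "compaction_finished"):
                print(ev["job"], ev["event"], ev.get("lsm_state"))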
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970181120, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970184266, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.045943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:11.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:11 np0005548916 python3.9[189722]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
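In the _raw_params above, #012 is rsyslog's octal escape for an embedded newline, so the logged shell is: set -o pipefail; ls -lRZ /run/libvirt | grep -E ':container_\S+_t'. It verifies that nothing under /run/libvirt still carries a container SELinux type. The same check in Python, reusing the regex from the log:

    #!/usr/bin/env python3
    # Sketch: flag container_*_t SELinux labels under /run/libvirt,
    # mirroring the `ls -lRZ | grep` one-liner logged above.
    import re
    import subprocess

    out = subprocess.run(
        ["ls", "-lRZ", "/run/libvirt"],
        capture_output=True, text=True, check=True,
    ).stdout
    hits = [l for l in out.splitlines() if re.search(r":container_\S+_t", l)]
    print("\n".join(hits) or "no container-labelled files")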
Dec  6 04:56:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:12 np0005548916 python3.9[189878]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  6 04:56:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:13.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:13.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:14 np0005548916 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  6 04:56:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:14 np0005548916 python3.9[190036]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:15.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:56:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:15.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:56:15 np0005548916 python3.9[190188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:56:16 np0005548916 python3.9[190340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:16 np0005548916 python3.9[190493]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:17.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:17.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:17 np0005548916 python3.9[190645]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:18 np0005548916 python3.9[190800]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:56:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:19.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:19 np0005548916 python3.9[190952]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:19 np0005548916 python3.9[191104]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:20 np0005548916 python3.9[191257]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095620 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:56:21 np0005548916 python3.9[191409]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:21.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:22 np0005548916 podman[191561]: 2025-12-06 09:56:22.356116776 +0000 UTC m=+0.113497436 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
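The podman health_status event above carries the container's whole spec in its config_data field as a Python dict literal, so it can be parsed with ast.literal_eval once the balanced {...} span is cut out of the line. A sketch, assuming the log line was saved to a file and that no brace characters occur inside the quoted values (true of the line above):

    #!/usr/bin/env python3
    # Sketch: recover the config_data dict from a podman health line
    # like the ovn_controller event above.
    import ast

    def config_data(line: str) -> dict:
        start = line.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(line[start:], start):
            depth += ch == "{"
            depth -= ch == "}"
            if depth == 0:
                return ast.literal_eval(line[start:i + 1])
        raise ValueError("unbalanced config_data")

    line = open("health_line.txt").read()   # hypothetical saved log line
    print(config_data(line)["volumes"])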
Dec  6 04:56:22 np0005548916 python3.9[191563]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:22 np0005548916 systemd[1]: Reloading.
Dec  6 04:56:22 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:22 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:22 np0005548916 systemd[1]: Starting libvirt logging daemon socket...
Dec  6 04:56:22 np0005548916 systemd[1]: Listening on libvirt logging daemon socket.
Dec  6 04:56:22 np0005548916 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  6 04:56:22 np0005548916 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  6 04:56:22 np0005548916 systemd[1]: Starting libvirt logging daemon...
Dec  6 04:56:23 np0005548916 systemd[1]: Started libvirt logging daemon.
Dec  6 04:56:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:23.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:23.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:23 np0005548916 python3.9[191782]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:23 np0005548916 systemd[1]: Reloading.
Dec  6 04:56:23 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:23 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:24 np0005548916 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  6 04:56:24 np0005548916 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  6 04:56:24 np0005548916 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  6 04:56:24 np0005548916 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  6 04:56:24 np0005548916 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  6 04:56:24 np0005548916 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  6 04:56:24 np0005548916 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  6 04:56:24 np0005548916 systemd[1]: Starting libvirt nodedev daemon...
Dec  6 04:56:24 np0005548916 systemd[1]: Started libvirt nodedev daemon.
Dec  6 04:56:24 np0005548916 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  6 04:56:24 np0005548916 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  6 04:56:24 np0005548916 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  6 04:56:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:25 np0005548916 python3.9[192006]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:25 np0005548916 systemd[1]: Reloading.
Dec  6 04:56:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:25.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:25 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:25 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:25.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:25 np0005548916 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  6 04:56:25 np0005548916 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  6 04:56:25 np0005548916 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  6 04:56:25 np0005548916 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  6 04:56:25 np0005548916 systemd[1]: Starting libvirt proxy daemon...
Dec  6 04:56:25 np0005548916 setroubleshoot[191818]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 0aa677db-8f04-42d4-9355-b01c7fb3c0b5
Dec  6 04:56:25 np0005548916 systemd[1]: Started libvirt proxy daemon.
Dec  6 04:56:25 np0005548916 setroubleshoot[191818]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.

*****  Plugin dac_override (91.4 confidence) suggests   **********************

If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
Then turn on full auditing to get path information about the offending file and generate the error again.
Do

Turn on full auditing
# auditctl -w /etc/shadow -p w
Try to recreate AVC. Then execute
# ausearch -m avc -ts recent
If you see PATH record check ownership/permissions on file, and fix it,
otherwise report as a bugzilla.

*****  Plugin catchall (9.59 confidence) suggests   **************************

If you believe that virtlogd should have the dac_read_search capability by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
# semodule -X 300 -i my-virtlogd.pp
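The catchall plugin's suggestion above builds a local policy module from the recorded AVCs. A hedged sketch wrapping exactly the commands from the log; it must run as root, and the generated my-virtlogd.te should be reviewed before granting virtlogd dac_read_search rather than loaded blindly:

    #!/usr/bin/env python3
    # Sketch: the local-policy workflow setroubleshoot suggests above.
    # Commands are copied verbatim from the log; run as root and inspect
    # my-virtlogd.te before installing the module.
    import subprocess

    raw = subprocess.run(["ausearch", "-c", "virtlogd", "--raw"],
                         capture_output=True, check=True).stdout
    subprocess.run(["audit2allow", "-M", "my-virtlogd"],
                   input=raw, check=True)
    subprocess.run(["semodule", "-X", "300", "-i", "my-virtlogd.pp"],
                   check=True)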
Dec  6 04:56:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:26 np0005548916 python3.9[192220]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:26 np0005548916 systemd[1]: Reloading.
Dec  6 04:56:26 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:26 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:26 np0005548916 systemd[1]: Listening on libvirt locking daemon socket.
Dec  6 04:56:26 np0005548916 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  6 04:56:26 np0005548916 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  6 04:56:26 np0005548916 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  6 04:56:26 np0005548916 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  6 04:56:26 np0005548916 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  6 04:56:26 np0005548916 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  6 04:56:26 np0005548916 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  6 04:56:26 np0005548916 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  6 04:56:26 np0005548916 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  6 04:56:26 np0005548916 systemd[1]: Starting libvirt QEMU daemon...
Dec  6 04:56:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:27 np0005548916 systemd[1]: Started libvirt QEMU daemon.
Dec  6 04:56:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:27.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:27.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:27 np0005548916 podman[192406]: 2025-12-06 09:56:27.555588207 +0000 UTC m=+0.078642915 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 04:56:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:27 np0005548916 python3.9[192453]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:27 np0005548916 systemd[1]: Reloading.
Dec  6 04:56:27 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:27 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:28 np0005548916 systemd[1]: Starting libvirt secret daemon socket...
Dec  6 04:56:28 np0005548916 systemd[1]: Listening on libvirt secret daemon socket.
Dec  6 04:56:28 np0005548916 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  6 04:56:28 np0005548916 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  6 04:56:28 np0005548916 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  6 04:56:28 np0005548916 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  6 04:56:28 np0005548916 systemd[1]: Starting libvirt secret daemon...
Dec  6 04:56:28 np0005548916 systemd[1]: Started libvirt secret daemon.
Dec  6 04:56:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:29.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:29.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:29 np0005548916 python3.9[192691]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:56:30 np0005548916 python3.9[192843]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:56:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:31 np0005548916 python3.9[192996]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:31.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:31 np0005548916 python3.9[193150]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:56:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:32 np0005548916 python3.9[193301]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:56:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:33.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:33.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:33 np0005548916 python3.9[193422]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014992.3877494-3360-250846414704821/.source.xml follow=False _original_basename=secret.xml.j2 checksum=f7c948a7651e1e704e9fb6c67bea136c2b7876ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:34 np0005548916 python3.9[193574]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 5ecd3f74-dade-5fc4-92ce-8950ae424258#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:35 np0005548916 python3.9[193737]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:35.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:35.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095635 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:56:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:35 np0005548916 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  6 04:56:35 np0005548916 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.044s CPU time.
Dec  6 04:56:35 np0005548916 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  6 04:56:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:56:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:37.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:37 np0005548916 python3.9[194201]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:38 np0005548916 python3.9[194353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:38 np0005548916 python3.9[194477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014997.8608117-3525-221189080822303/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:56:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:39.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:39 np0005548916 python3.9[194629]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:40 np0005548916 python3.9[194782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:41.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:41 np0005548916 python3.9[194860]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:41 np0005548916 python3.9[195012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:42 np0005548916 python3.9[195091]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hnm4iwsp recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095642 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:56:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:43.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:43 np0005548916 python3.9[195243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:43.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.372964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003373131, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 542, "num_deletes": 251, "total_data_size": 892969, "memory_usage": 903152, "flush_reason": "Manual Compaction"}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003380179, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 589687, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17606, "largest_seqno": 18143, "table_properties": {"data_size": 586862, "index_size": 861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6542, "raw_average_key_size": 18, "raw_value_size": 581332, "raw_average_value_size": 1665, "num_data_blocks": 39, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014971, "oldest_key_time": 1765014971, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7231 microseconds, and 3707 cpu microseconds.
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.380225) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 589687 bytes OK
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.380242) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381553) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381566) EVENT_LOG_v1 {"time_micros": 1765015003381562, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381586) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 889836, prev total WAL file size 889836, number of live WAL files 2.
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.382222) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(575KB)], [30(13MB)]
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003382346, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14526008, "oldest_snapshot_seqno": -1}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4870 keys, 12334259 bytes, temperature: kUnknown
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003472814, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12334259, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12300996, "index_size": 19969, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123201, "raw_average_key_size": 25, "raw_value_size": 12211804, "raw_average_value_size": 2507, "num_data_blocks": 830, "num_entries": 4870, "num_filter_entries": 4870, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.473181) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12334259 bytes
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.474717) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.4 rd, 136.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 13.3 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(45.6) write-amplify(20.9) OK, records in: 5380, records dropped: 510 output_compression: NoCompression
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.474738) EVENT_LOG_v1 {"time_micros": 1765015003474728, "job": 16, "event": "compaction_finished", "compaction_time_micros": 90567, "compaction_time_cpu_micros": 27597, "output_level": 6, "num_output_files": 1, "total_output_size": 12334259, "num_input_records": 5380, "num_output_records": 4870, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003474967, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003477527, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.382083) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:43 np0005548916 python3.9[195321]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:44 np0005548916 python3.9[195474]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:45.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:45 np0005548916 python3[195627]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:56:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:46 np0005548916 python3.9[195779]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:46 np0005548916 python3.9[195858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:47.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:47.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:48 np0005548916 python3.9[196012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:48 np0005548916 auditd[703]: Audit daemon rotating log files
Dec  6 04:56:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:49 np0005548916 python3.9[196117]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:49.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:49.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:49 np0005548916 python3.9[196269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:50 np0005548916 python3.9[196347]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:51 np0005548916 python3.9[196500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:51.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:56:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:51.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:56:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:51 np0005548916 python3.9[196578]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:52 np0005548916 python3.9[196756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:52 np0005548916 podman[196848]: 2025-12-06 09:56:52.800002989 +0000 UTC m=+0.097181359 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:56:53 np0005548916 python3.9[196964]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765015011.925789-3900-41308981290846/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:53.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:53.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:56:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:56:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:54 np0005548916 python3.9[197116]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.266 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:56:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:56:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:56:54 np0005548916 python3.9[197269]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:55.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:55.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:55 np0005548916 python3.9[197424]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:56 np0005548916 python3.9[197576]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:57 np0005548916 python3.9[197730]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:56:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:56:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:57.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:56:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:57.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:57 np0005548916 podman[197832]: 2025-12-06 09:56:57.758691153 +0000 UTC m=+0.058491972 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  6 04:56:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:58 np0005548916 python3.9[197903]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:58 np0005548916 python3.9[198084]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
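The stat/apply/remove sequence above implements a change-marker pattern: the rulesets are piped through `nft -f -` only while /etc/nftables/edpm-rules.nft.changed exists, and the marker is deleted afterwards. A Python sketch of the same idempotent flow (paths taken from the logged parameters; error handling simplified):

    import os
    import subprocess

    MARKER = "/etc/nftables/edpm-rules.nft.changed"
    RULESETS = [
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
    ]

    def apply_if_changed() -> None:
        """Reload the concatenated rulesets only when the marker file exists."""
        if not os.path.exists(MARKER):
            return
        payload = "".join(open(path).read() for path in RULESETS)
        subprocess.run(["nft", "-f", "-"], input=payload, text=True, check=True)
        os.remove(MARKER)  # mirrors the state=absent task that follows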
Dec  6 04:56:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:59 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:59.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:56:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:59.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:59 np0005548916 python3.9[198236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:00 np0005548916 python3.9[198359]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015019.058871-4116-133551360775595/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:00 np0005548916 python3.9[198512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:01.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:01.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:01 np0005548916 python3.9[198635]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015020.4203186-4162-158177263821505/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:02 np0005548916 python3.9[198787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:02 np0005548916 python3.9[198911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015021.6784155-4206-79333795060797/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
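The three copy tasks above install unit files (edpm_libvirt.target, edpm_libvirt_guests.service, virt-guest-shutdown.target) under /etc/systemd/system with mode 0644; the actual file contents are masked in the log (content=NOT_LOGGING_PARAMETER). A sketch of the deploy step under that assumption, with a placeholder unit body:

    import os

    def install_unit(name: str, body: str) -> str:
        """Write a unit file under /etc/systemd/system with mode 0644."""
        path = os.path.join("/etc/systemd/system", name)
        with open(path, "w") as handle:
            handle.write(body)
        os.chmod(path, 0o644)
        return path

    # Placeholder content; the real unit text is not logged.
    install_unit("edpm_libvirt.target",
                 "[Unit]\nDescription=EDPM libvirt target (assumed)\n")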
Dec  6 04:57:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:03.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:03.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:03 np0005548916 python3.9[199063]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:57:03 np0005548916 systemd[1]: Reloading.
Dec  6 04:57:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:03 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec  6 04:57:03 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:03 np0005548916 systemd[1]: Reached target edpm_libvirt.target.
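The ansible systemd call above (daemon_reload=True, enabled=True, state=restarted) maps onto three systemctl operations, which is why a "Reloading." line precedes "Reached target edpm_libvirt.target". A sketch of the equivalent sequence:

    import subprocess

    def systemctl(*args: str) -> None:
        subprocess.run(["systemctl", *args], check=True)

    systemctl("daemon-reload")                    # daemon_reload=True
    systemctl("enable", "edpm_libvirt.target")    # enabled=True
    systemctl("restart", "edpm_libvirt.target")   # state=restarted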
Dec  6 04:57:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:04 np0005548916 python3.9[199255]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:57:04 np0005548916 systemd[1]: Reloading.
Dec  6 04:57:04 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec  6 04:57:04 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:05.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:05 np0005548916 systemd[1]: Reloading.
Dec  6 04:57:05 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:05 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec  6 04:57:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:06 np0005548916 systemd[1]: session-52.scope: Deactivated successfully.
Dec  6 04:57:06 np0005548916 systemd[1]: session-52.scope: Consumed 3min 45.764s CPU time.
Dec  6 04:57:06 np0005548916 systemd-logind[788]: Session 52 logged out. Waiting for processes to exit.
Dec  6 04:57:06 np0005548916 systemd-logind[788]: Removed session 52.
Dec  6 04:57:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:07.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:07.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:09.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:09.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:11.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:11 np0005548916 systemd-logind[788]: New session 53 of user zuul.
Dec  6 04:57:11 np0005548916 systemd[1]: Started Session 53 of User zuul.
Dec  6 04:57:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:12 np0005548916 python3.9[199533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
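gather_subset=['!all', '!min', 'local'] restricts fact collection to local facts from fact_path: Ansible reads *.fact files under /etc/ansible/facts.d, parsing them as JSON (or executing them when they are executable) and exposing the result as ansible_local. A simplified reader for the static-JSON case only (executable facts are skipped in this sketch):

    import json
    import os

    def read_local_facts(fact_path: str = "/etc/ansible/facts.d") -> dict:
        """Collect static *.fact JSON files the way ansible_local does."""
        facts = {}
        if not os.path.isdir(fact_path):
            return facts
        for entry in os.listdir(fact_path):
            if entry.endswith(".fact"):
                name = entry[:-len(".fact")]
                with open(os.path.join(fact_path, entry)) as handle:
                    facts[name] = json.load(handle)
        return facts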
Dec  6 04:57:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:13.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:14 np0005548916 python3.9[199687]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:57:14 np0005548916 network[199705]: You are using the 'network' service provided by 'network-scripts', which is now deprecated.
Dec  6 04:57:14 np0005548916 network[199706]: 'network-scripts' will be removed from the distribution in the near future.
Dec  6 04:57:14 np0005548916 network[199707]: It is advised to switch to 'NetworkManager' for network management instead.
Dec  6 04:57:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:15.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:17.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:18 np0005548916 python3.9[199981]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:57:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:19.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:19.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:19 np0005548916 python3.9[200065]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
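The dnf task above installs iscsi-initiator-utils with state=present, i.e. it is a no-op when the package is already on the host. A sketch of that idempotent check-then-install pattern using rpm and dnf directly:

    import subprocess

    def ensure_installed(package: str) -> None:
        """Install only when `rpm -q` reports the package as missing."""
        present = subprocess.run(
            ["rpm", "-q", package], capture_output=True
        ).returncode == 0
        if not present:
            subprocess.run(["dnf", "-y", "install", package], check=True)

    ensure_installed("iscsi-initiator-utils")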
Dec  6 04:57:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:21.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:23.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:23.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:23 np0005548916 podman[200070]: 2025-12-06 09:57:23.818647211 +0000 UTC m=+0.119465786 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  6 04:57:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:25.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:26 np0005548916 python3.9[200248]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:27.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:27 np0005548916 python3.9[200401]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
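restorecon is invoked above with -n (dry run, change nothing), -v (report each pending change) and -r (recurse), so this task only detects SELinux label drift under /etc/iscsi and /var/lib/iscsi. A sketch that flags drift from the dry-run output:

    import subprocess

    def selinux_drift(*paths: str) -> bool:
        """True when a dry-run restorecon would relabel anything."""
        out = subprocess.run(
            ["/usr/sbin/restorecon", "-nvr", *paths],
            capture_output=True, text=True, check=True,
        ).stdout
        return bool(out.strip())  # -n -v prints one line per pending relabel

    print(selinux_drift("/etc/iscsi", "/var/lib/iscsi"))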
Dec  6 04:57:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:28 np0005548916 podman[200526]: 2025-12-06 09:57:28.13175862 +0000 UTC m=+0.065037937 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 04:57:28 np0005548916 python3.9[200572]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:29 np0005548916 python3.9[200727]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:57:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:29.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:29.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:29 np0005548916 python3.9[200905]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:30 np0005548916 python3.9[201029]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015049.3007045-246-26049286193805/.source.iscsi _original_basename=.r3uie0hx follow=False checksum=eaccd56aaf590b98db17b6975888b71367194346 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:31.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:31 np0005548916 python3.9[201181]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
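The sequence above generates a fresh IQN with /usr/sbin/iscsi-iname, writes it to /etc/iscsi/initiatorname.iscsi (mode 0644), and touches /etc/iscsi/.initiator_reset (mode 0600) as a marker so the reset is not repeated. A sketch of the whole flow, using the standard open-iscsi InitiatorName= file format:

    import pathlib
    import subprocess

    def reset_initiator_name() -> str:
        """Generate an IQN and persist it the way the tasks above do."""
        iqn = subprocess.run(
            ["/usr/sbin/iscsi-iname"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        conf = pathlib.Path("/etc/iscsi/initiatorname.iscsi")
        conf.write_text(f"InitiatorName={iqn}\n")
        conf.chmod(0o644)
        pathlib.Path("/etc/iscsi/.initiator_reset").touch(mode=0o600)
        return iqn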
Dec  6 04:57:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:32 np0005548916 python3.9[201333]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
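The lineinfile task above enforces `node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5` in /etc/iscsi/iscsid.conf: replace a line matching regexp if one exists, otherwise insert the line after the commented template matched by insertafter. A simplified reimplementation of that replace-or-insert logic (Ansible matches the last occurrence; this sketch uses the first for brevity):

    import re

    def line_in_file(path: str, line: str, regexp: str, insertafter: str) -> None:
        """Replace the first regexp match, else insert after `insertafter`."""
        with open(path) as handle:
            lines = handle.read().splitlines()
        pat, anchor = re.compile(regexp), re.compile(insertafter)
        for i, text in enumerate(lines):
            if pat.search(text):
                lines[i] = line          # already present: normalize in place
                break
        else:
            for i, text in enumerate(lines):
                if anchor.search(text):  # insert under the commented example
                    lines.insert(i + 1, line)
                    break
            else:
                lines.append(line)
        with open(path, "w") as handle:
            handle.write("\n".join(lines) + "\n")

    line_in_file("/etc/iscsi/iscsid.conf",
                 "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5",
                 r"^node.session.auth.chap_algs",
                 r"^#node.session.auth.chap.algs")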
Dec  6 04:57:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:33.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:33.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:33 np0005548916 python3.9[201486]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:57:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:33 np0005548916 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  6 04:57:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:34 np0005548916 python3.9[201642]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None

Dec  6 04:57:34 np0005548916 systemd[1]: Reloading.
Dec  6 04:57:34 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec  6 04:57:34 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:34 np0005548916 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  6 04:57:34 np0005548916 systemd[1]: Starting Open-iSCSI...
Dec  6 04:57:34 np0005548916 kernel: Loading iSCSI transport class v2.0-870.
Dec  6 04:57:34 np0005548916 systemd[1]: Started Open-iSCSI.
Dec  6 04:57:34 np0005548916 systemd[1]: Starting Logout of all iSCSI sessions on shutdown...
Dec  6 04:57:34 np0005548916 systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
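iscsid.socket is enabled and started first, then the iscsid service itself; with socket activation the daemon could also start lazily on the first connection, but the role brings both up explicitly. Equivalent systemctl calls:

    import subprocess

    for unit in ("iscsid.socket", "iscsid"):
        subprocess.run(["systemctl", "enable", unit], check=True)  # enabled=True
        subprocess.run(["systemctl", "start", unit], check=True)   # state=started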
Dec  6 04:57:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:35.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:35 np0005548916 python3.9[201844]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:57:36 np0005548916 network[201861]: You are using the 'network' service provided by 'network-scripts', which is now deprecated.
Dec  6 04:57:36 np0005548916 network[201862]: 'network-scripts' will be removed from the distribution in the near future.
Dec  6 04:57:36 np0005548916 network[201863]: It is advised to switch to 'NetworkManager' for network management instead.
Dec  6 04:57:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:37.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095738 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:57:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:39.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:39.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:40 np0005548916 python3.9[202138]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:57:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:41.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:41 np0005548916 python3.9[202291]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
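community.general.modprobe with state=present and persistent=disabled loads dm-multipath into the running kernel only; persistence across reboots is handled separately by the tasks that follow. A sketch of an idempotent runtime load:

    import os
    import subprocess

    def ensure_module(name: str) -> None:
        """modprobe only when the module isn't already loaded."""
        sysfs = "/sys/module/" + name.replace("-", "_")
        if not os.path.isdir(sysfs):
            subprocess.run(["modprobe", name], check=True)

    ensure_module("dm-multipath")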
Dec  6 04:57:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:41.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:42 np0005548916 python3.9[202447]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:42 np0005548916 python3.9[202571]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015061.6562243-477-13239230047815/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:43.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:43.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:43 np0005548916 python3.9[202723]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:44 np0005548916 python3.9[202876]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:57:44 np0005548916 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  6 04:57:44 np0005548916 systemd[1]: Stopped Load Kernel Modules.
Dec  6 04:57:44 np0005548916 systemd[1]: Stopping Load Kernel Modules...
Dec  6 04:57:44 np0005548916 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:57:44 np0005548916 systemd[1]: Finished Load Kernel Modules.
Dec  6 04:57:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:45.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:45 np0005548916 python3.9[203032]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:46 np0005548916 python3.9[203185]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:47 np0005548916 python3.9[203337]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:47.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:48 np0005548916 python3.9[203489]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:57:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:48 np0005548916 python3.9[203613]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015067.631713-651-67470341557495/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:49.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:49.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:49 np0005548916 python3.9[203790]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:57:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:50 np0005548916 python3.9[203944]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:51.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:57:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:57:51 np0005548916 python3.9[204096]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:52 np0005548916 python3.9[204249]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:52 np0005548916 python3.9[204402]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:53.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:53.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:53 np0005548916 python3.9[204554]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.267 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:57:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:57:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.269 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:57:54 np0005548916 podman[204678]: 2025-12-06 09:57:54.305034219 +0000 UTC m=+0.122280545 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 04:57:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:57:54 np0005548916 python3.9[204717]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:55 np0005548916 python3.9[204887]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:55.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:55.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:56 np0005548916 python3.9[205039]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:56 np0005548916 python3.9[205194]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:57.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:57 np0005548916 python3.9[205346]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:58 np0005548916 podman[205427]: 2025-12-06 09:57:58.592340384 +0000 UTC m=+0.052440182 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  6 04:57:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:58 np0005548916 python3.9[205570]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:59 np0005548916 python3.9[205676]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:57:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:57:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:59.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:57:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:00 np0005548916 python3.9[205828]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:58:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:00 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:58:00 np0005548916 python3.9[205907]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095800 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:58:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000075s ======
Dec  6 04:58:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:01.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000075s
Dec  6 04:58:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:01.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:01 np0005548916 python3.9[206059]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:02 np0005548916 python3.9[206211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:03 np0005548916 python3.9[206290]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:03.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:03.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:03 np0005548916 python3.9[206442]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:04 np0005548916 python3.9[206520]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:05.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:05 np0005548916 python3.9[206673]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:05 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:05 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:05 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:05.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:06 np0005548916 python3.9[206887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:07 np0005548916 python3.9[206965]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:07.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:08 np0005548916 python3.9[207117]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:08 np0005548916 python3.9[207196]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:09.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:09.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:09 np0005548916 python3.9[207373]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:09 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:09 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:09 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:10 np0005548916 systemd[1]: Starting Create netns directory...
Dec  6 04:58:10 np0005548916 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:58:10 np0005548916 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:58:10 np0005548916 systemd[1]: Finished Create netns directory.
Dec  6 04:58:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:11.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:11 np0005548916 python3.9[207569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:11.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:12 np0005548916 python3.9[207721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:12 np0005548916 python3.9[207845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015091.7797637-1272-180924122192738/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:13.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:13.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:14 np0005548916 python3.9[207997]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:14 np0005548916 python3.9[208150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:15.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:15.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:15 np0005548916 python3.9[208273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015094.4522076-1347-5478813197091/.source.json _original_basename=.5mlf3od5 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:16 np0005548916 python3.9[208425]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:17.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:17.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:19 np0005548916 python3.9[208854]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  6 04:58:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:19.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:19.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:20 np0005548916 python3.9[209006]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:58:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:21 np0005548916 python3.9[209159]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:58:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:21.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:21.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:23.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:23 np0005548916 python3[209339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 04:58:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:23.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004300 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:24 np0005548916 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  6 04:58:24 np0005548916 podman[209354]: 2025-12-06 09:58:24.530125812 +0000 UTC m=+0.998665165 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548916 podman[209392]: 2025-12-06 09:58:24.571506537 +0000 UTC m=+0.199792007 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  6 04:58:24 np0005548916 podman[209441]: 2025-12-06 09:58:24.700414928 +0000 UTC m=+0.060839167 container create ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Dec  6 04:58:24 np0005548916 podman[209441]: 2025-12-06 09:58:24.669977485 +0000 UTC m=+0.030401744 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548916 python3[209339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:25.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:25.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:25 np0005548916 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  6 04:58:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004300 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:26 np0005548916 python3.9[209632]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:26 np0005548916 python3.9[209787]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:27.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:27 np0005548916 python3.9[209863]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:27.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:28 np0005548916 python3.9[210014]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015107.5045774-1611-254324911939266/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:28 np0005548916 python3.9[210091]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:58:28 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:28 np0005548916 podman[210092]: 2025-12-06 09:58:28.797164551 +0000 UTC m=+0.090611994 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:58:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:28 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:28 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:29.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:29.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:29 np0005548916 python3.9[210248]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:29 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:29 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:29 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:30 np0005548916 systemd[1]: Starting multipathd container...
Dec  6 04:58:30 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:58:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:30 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:30 np0005548916 systemd[1]: Started /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec  6 04:58:30 np0005548916 podman[210293]: 2025-12-06 09:58:30.413324753 +0000 UTC m=+0.144929330 container init ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Dec  6 04:58:30 np0005548916 multipathd[210308]: + sudo -E kolla_set_configs
Dec  6 04:58:30 np0005548916 podman[210293]: 2025-12-06 09:58:30.442046434 +0000 UTC m=+0.173650971 container start ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Dec  6 04:58:30 np0005548916 podman[210293]: multipathd
Dec  6 04:58:30 np0005548916 systemd[1]: Started multipathd container.
Dec  6 04:58:30 np0005548916 multipathd[210308]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:58:30 np0005548916 multipathd[210308]: INFO:__main__:Validating config file
Dec  6 04:58:30 np0005548916 multipathd[210308]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:58:30 np0005548916 multipathd[210308]: INFO:__main__:Writing out command to execute
Dec  6 04:58:30 np0005548916 podman[210314]: 2025-12-06 09:58:30.554331524 +0000 UTC m=+0.100692804 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  6 04:58:30 np0005548916 multipathd[210308]: ++ cat /run_command
Dec  6 04:58:30 np0005548916 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:58:30 np0005548916 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.service: Failed with result 'exit-code'.
Dec  6 04:58:30 np0005548916 multipathd[210308]: + CMD='/usr/sbin/multipathd -d'
Dec  6 04:58:30 np0005548916 multipathd[210308]: + ARGS=
Dec  6 04:58:30 np0005548916 multipathd[210308]: + sudo kolla_copy_cacerts
Dec  6 04:58:30 np0005548916 multipathd[210308]: + [[ ! -n '' ]]
Dec  6 04:58:30 np0005548916 multipathd[210308]: + . kolla_extend_start
Dec  6 04:58:30 np0005548916 multipathd[210308]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  6 04:58:30 np0005548916 multipathd[210308]: Running command: '/usr/sbin/multipathd -d'
Dec  6 04:58:30 np0005548916 multipathd[210308]: + umask 0022
Dec  6 04:58:30 np0005548916 multipathd[210308]: + exec /usr/sbin/multipathd -d
Dec  6 04:58:30 np0005548916 multipathd[210308]: 3483.291667 | --------start up--------
Dec  6 04:58:30 np0005548916 multipathd[210308]: 3483.291874 | read /etc/multipath.conf
Dec  6 04:58:30 np0005548916 multipathd[210308]: 3483.300140 | path checkers start up
Dec  6 04:58:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:31.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:31 np0005548916 python3.9[210495]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:31.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:32 np0005548916 python3.9[210649]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:58:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:33 np0005548916 python3.9[210815]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:33 np0005548916 systemd[1]: Stopping multipathd container...
Dec  6 04:58:33 np0005548916 multipathd[210308]: 3485.965419 | exit (signal)
Dec  6 04:58:33 np0005548916 multipathd[210308]: 3485.965533 | --------shut down-------
Dec  6 04:58:33 np0005548916 systemd[1]: libpod-ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.scope: Deactivated successfully.
Dec  6 04:58:33 np0005548916 podman[210819]: 2025-12-06 09:58:33.321704337 +0000 UTC m=+0.090180394 container died ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  6 04:58:33 np0005548916 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.timer: Deactivated successfully.
Dec  6 04:58:33 np0005548916 systemd[1]: Stopped /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec  6 04:58:33 np0005548916 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-userdata-shm.mount: Deactivated successfully.
Dec  6 04:58:33 np0005548916 systemd[1]: var-lib-containers-storage-overlay-730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626-merged.mount: Deactivated successfully.
Dec  6 04:58:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:33.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:33 np0005548916 podman[210819]: 2025-12-06 09:58:33.560055637 +0000 UTC m=+0.328531724 container cleanup ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:58:33 np0005548916 podman[210819]: multipathd
Dec  6 04:58:33 np0005548916 podman[210847]: multipathd
Dec  6 04:58:33 np0005548916 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  6 04:58:33 np0005548916 systemd[1]: Stopped multipathd container.
Dec  6 04:58:33 np0005548916 systemd[1]: Starting multipathd container...
Dec  6 04:58:33 np0005548916 systemd[1]: Started libcrun container.
Dec  6 04:58:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:33 np0005548916 systemd[1]: Started /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec  6 04:58:33 np0005548916 podman[210861]: 2025-12-06 09:58:33.770531427 +0000 UTC m=+0.118590887 container init ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 04:58:33 np0005548916 multipathd[210877]: + sudo -E kolla_set_configs
Dec  6 04:58:33 np0005548916 podman[210861]: 2025-12-06 09:58:33.799193617 +0000 UTC m=+0.147253057 container start ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  6 04:58:33 np0005548916 podman[210861]: multipathd
Dec  6 04:58:33 np0005548916 systemd[1]: Started multipathd container.
Dec  6 04:58:33 np0005548916 multipathd[210877]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:58:33 np0005548916 multipathd[210877]: INFO:__main__:Validating config file
Dec  6 04:58:33 np0005548916 multipathd[210877]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:58:33 np0005548916 multipathd[210877]: INFO:__main__:Writing out command to execute
Dec  6 04:58:33 np0005548916 multipathd[210877]: ++ cat /run_command
Dec  6 04:58:33 np0005548916 multipathd[210877]: + CMD='/usr/sbin/multipathd -d'
Dec  6 04:58:33 np0005548916 multipathd[210877]: + ARGS=
Dec  6 04:58:33 np0005548916 multipathd[210877]: + sudo kolla_copy_cacerts
Dec  6 04:58:33 np0005548916 multipathd[210877]: + [[ ! -n '' ]]
Dec  6 04:58:33 np0005548916 multipathd[210877]: + . kolla_extend_start
Dec  6 04:58:33 np0005548916 multipathd[210877]: Running command: '/usr/sbin/multipathd -d'
Dec  6 04:58:33 np0005548916 multipathd[210877]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  6 04:58:33 np0005548916 multipathd[210877]: + umask 0022
Dec  6 04:58:33 np0005548916 multipathd[210877]: + exec /usr/sbin/multipathd -d
Dec  6 04:58:33 np0005548916 podman[210884]: 2025-12-06 09:58:33.890024086 +0000 UTC m=+0.077315325 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 04:58:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:33 np0005548916 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-4676a25146bf9417.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:58:33 np0005548916 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-4676a25146bf9417.service: Failed with result 'exit-code'.
Dec  6 04:58:33 np0005548916 multipathd[210877]: 3486.575341 | --------start up--------
Dec  6 04:58:33 np0005548916 multipathd[210877]: 3486.575366 | read /etc/multipath.conf
Dec  6 04:58:33 np0005548916 multipathd[210877]: 3486.582330 | path checkers start up
Dec  6 04:58:34 np0005548916 python3.9[211070]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:35.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:35.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:36 np0005548916 python3.9[211222]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:58:36 np0005548916 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  6 04:58:36 np0005548916 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  6 04:58:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:36 np0005548916 python3.9[211375]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  6 04:58:36 np0005548916 kernel: Key type psk registered
Dec  6 04:58:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:37.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:37.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:37 np0005548916 python3.9[211540]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:38 np0005548916 python3.9[211663]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015117.184472-1851-204537384215914/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:39 np0005548916 python3.9[211816]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.199019) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120199358, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1328, "num_deletes": 256, "total_data_size": 3264946, "memory_usage": 3311728, "flush_reason": "Manual Compaction"}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120217192, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2138927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18148, "largest_seqno": 19471, "table_properties": {"data_size": 2133279, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11449, "raw_average_key_size": 18, "raw_value_size": 2121946, "raw_average_value_size": 3455, "num_data_blocks": 137, "num_entries": 614, "num_filter_entries": 614, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015004, "oldest_key_time": 1765015004, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 18446 microseconds, and 10038 cpu microseconds.
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.217492) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2138927 bytes OK
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.217626) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220107) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220257) EVENT_LOG_v1 {"time_micros": 1765015120220231, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220325) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3258646, prev total WAL file size 3258646, number of live WAL files 2.
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.222917) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2088KB)], [33(11MB)]
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120223064, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14473186, "oldest_snapshot_seqno": -1}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4958 keys, 13987457 bytes, temperature: kUnknown
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120338652, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13987457, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13952617, "index_size": 21354, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126161, "raw_average_key_size": 25, "raw_value_size": 13860895, "raw_average_value_size": 2795, "num_data_blocks": 876, "num_entries": 4958, "num_filter_entries": 4958, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.339049) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13987457 bytes
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.340389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 125.1 rd, 120.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.8 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(13.3) write-amplify(6.5) OK, records in: 5484, records dropped: 526 output_compression: NoCompression
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.340409) EVENT_LOG_v1 {"time_micros": 1765015120340398, "job": 18, "event": "compaction_finished", "compaction_time_micros": 115731, "compaction_time_cpu_micros": 35985, "output_level": 6, "num_output_files": 1, "total_output_size": 13987457, "num_input_records": 5484, "num_output_records": 4958, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120340867, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120343107, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.222789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548916 python3.9[211969]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:40 np0005548916 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  6 04:58:40 np0005548916 systemd[1]: Stopped Load Kernel Modules.
Dec  6 04:58:40 np0005548916 systemd[1]: Stopping Load Kernel Modules...
Dec  6 04:58:40 np0005548916 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:58:40 np0005548916 systemd[1]: Finished Load Kernel Modules.
Dec  6 04:58:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:41.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:41.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:41 np0005548916 python3.9[212125]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:58:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:43.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:58:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:43.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:58:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:44 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:44 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:44 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:44 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:44 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:44 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:44 np0005548916 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  6 04:58:44 np0005548916 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  6 04:58:45 np0005548916 lvm[212240]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:58:45 np0005548916 lvm[212240]: VG ceph_vg0 finished
Dec  6 04:58:45 np0005548916 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:58:45 np0005548916 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:58:45 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:45 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:45 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:45.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:45 np0005548916 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:58:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:45.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:46 np0005548916 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:58:46 np0005548916 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:58:46 np0005548916 systemd[1]: man-db-cache-update.service: Consumed 1.774s CPU time.
Dec  6 04:58:46 np0005548916 systemd[1]: run-ra10d1b80ac27411e8f7c82adae0528ec.service: Deactivated successfully.
Dec  6 04:58:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:47.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:47.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:47 np0005548916 python3.9[213584]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:47 np0005548916 systemd[1]: Stopping Open-iSCSI...
Dec  6 04:58:47 np0005548916 iscsid[201683]: iscsid shutting down.
Dec  6 04:58:47 np0005548916 systemd[1]: iscsid.service: Deactivated successfully.
Dec  6 04:58:47 np0005548916 systemd[1]: Stopped Open-iSCSI.
Dec  6 04:58:47 np0005548916 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  6 04:58:47 np0005548916 systemd[1]: Starting Open-iSCSI...
Dec  6 04:58:47 np0005548916 systemd[1]: Started Open-iSCSI.
Dec  6 04:58:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:48 np0005548916 python3.9[213739]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:58:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:49.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:49.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:49 np0005548916 python3.9[213920]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:50 np0005548916 python3.9[214073]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:58:51 np0005548916 systemd[1]: Reloading.
Dec  6 04:58:51 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:51 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:51.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:51.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:52 np0005548916 python3.9[214257]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:58:52 np0005548916 network[214274]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:58:52 np0005548916 network[214275]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:58:52 np0005548916 network[214276]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:58:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:53.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:53.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:58:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:58:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:58:54 np0005548916 podman[214360]: 2025-12-06 09:58:54.826847351 +0000 UTC m=+0.120447643 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  6 04:58:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:55.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:57.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:57.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:59 np0005548916 python3.9[214581]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:59.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:59 np0005548916 podman[214583]: 2025-12-06 09:58:59.504041864 +0000 UTC m=+0.067571503 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:58:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:58:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:58:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:59.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:58:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:00 np0005548916 python3.9[214753]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:00 np0005548916 python3.9[214907]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:01.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:02 np0005548916 python3.9[215062]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:02 np0005548916 python3.9[215216]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:03.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:03 np0005548916 python3.9[215369]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:04 np0005548916 podman[215494]: 2025-12-06 09:59:04.183104595 +0000 UTC m=+0.077848588 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:59:04 np0005548916 python3.9[215537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:05 np0005548916 python3.9[215696]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:05.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:59:06 np0005548916 python3.9[216002]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:07.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:07 np0005548916 python3.9[216154]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:08 np0005548916 python3.9[216306]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:08 np0005548916 python3.9[216459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:09 np0005548916 python3.9[216611]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:09.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:10 np0005548916 python3.9[216788]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:10 np0005548916 python3.9[216941]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:11 np0005548916 python3.9[217093]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:11.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:12 np0005548916 python3.9[217271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:13.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:13 np0005548916 python3.9[217423]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:13.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:14 np0005548916 python3.9[217575]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:14 np0005548916 python3.9[217728]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:15 np0005548916 python3.9[217880]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:15.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:15 np0005548916 python3.9[218032]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:16 np0005548916 python3.9[218185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:17 np0005548916 python3.9[218337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:17.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:18 np0005548916 python3.9[218490]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
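(The #012 sequences in the _raw_params payload above are syslog-escaped newlines; decoded, the shell task dispatched by this ansible-ansible.legacy.command call reads as the following snippet — a direct unescaping of the logged command, not new content:)

    if systemctl is-active certmonger.service; then
      systemctl disable --now certmonger.service
      test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
    fi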
Dec  6 04:59:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:19.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:19 np0005548916 python3.9[218642]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:59:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:20 np0005548916 python3.9[218795]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:59:20 np0005548916 systemd[1]: Reloading.
Dec  6 04:59:20 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:59:20 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:59:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:21.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:21.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:21 np0005548916 python3.9[218982]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:22 np0005548916 python3.9[219136]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:23 np0005548916 python3.9[219289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:23.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:23.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:23 np0005548916 python3.9[219442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:24 np0005548916 python3.9[219596]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:25 np0005548916 podman[219721]: 2025-12-06 09:59:25.154378875 +0000 UTC m=+0.095846784 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  6 04:59:25 np0005548916 python3.9[219766]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:25.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:25.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:25 np0005548916 python3.9[219929]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:26 np0005548916 python3.9[220083]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:27.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:27.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:28 np0005548916 python3.9[220237]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:29 np0005548916 python3.9[220389]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:29.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:29.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:29 np0005548916 podman[220536]: 2025-12-06 09:59:29.748441219 +0000 UTC m=+0.053627308 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:59:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:29 np0005548916 python3.9[220583]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:30 np0005548916 python3.9[220736]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:31.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:31 np0005548916 python3.9[220889]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:31.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:32 np0005548916 python3.9[221041]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:32 np0005548916 python3.9[221195]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:33.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:33 np0005548916 python3.9[221347]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:33.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:34 np0005548916 python3.9[221499]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:34 np0005548916 podman[221501]: 2025-12-06 09:59:34.34917164 +0000 UTC m=+0.059262868 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  6 04:59:34 np0005548916 python3.9[221672]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:35.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:35.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:59:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s#012Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  6 04:59:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:37.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:37.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:39.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:39.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:40 np0005548916 python3.9[221827]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  6 04:59:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:41.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:41.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:42 np0005548916 python3.9[221980]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:59:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:43 np0005548916 python3.9[222139]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:59:43 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:59:43 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:59:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:43.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:43.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:43 np0005548916 kernel: ganesha.nfsd[221042]: segfault at 50 ip 00007fb2c8a7832e sp 00007fb28d7f9210 error 4 in libntirpc.so.5.8[7fb2c8a5d000+2c000] likely on CPU 7 (core 0, socket 7)
Dec  6 04:59:43 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:59:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy ignored for local
Dec  6 04:59:43 np0005548916 systemd[1]: Started Process Core Dump (PID 222173/UID 0).
Dec  6 04:59:44 np0005548916 systemd-logind[788]: New session 54 of user zuul.
Dec  6 04:59:44 np0005548916 systemd[1]: Started Session 54 of User zuul.
Dec  6 04:59:44 np0005548916 systemd-logind[788]: Session 54 logged out. Waiting for processes to exit.
Dec  6 04:59:44 np0005548916 systemd[1]: session-54.scope: Deactivated successfully.
Dec  6 04:59:44 np0005548916 systemd-logind[788]: Removed session 54.
Dec  6 04:59:45 np0005548916 python3.9[222329]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:59:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:45.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:59:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:45.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:45 np0005548916 systemd-coredump[222174]: Process 166453 (ganesha.nfsd) of user 0 dumped core.

Stack trace of thread 77:
#0  0x00007fb2c8a7832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
ELF object binary architecture: AMD x86-64
Dec  6 04:59:45 np0005548916 systemd[1]: systemd-coredump@6-222173-0.service: Deactivated successfully.
Dec  6 04:59:45 np0005548916 systemd[1]: systemd-coredump@6-222173-0.service: Consumed 1.800s CPU time.
Dec  6 04:59:45 np0005548916 podman[222412]: 2025-12-06 09:59:45.946222368 +0000 UTC m=+0.040767540 container died 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec  6 04:59:45 np0005548916 systemd[1]: var-lib-containers-storage-overlay-674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79-merged.mount: Deactivated successfully.
Dec  6 04:59:46 np0005548916 podman[222412]: 2025-12-06 09:59:46.006044429 +0000 UTC m=+0.100589581 container remove 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:59:46 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:59:46 np0005548916 python3.9[222467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015184.784028-3434-215211136421978/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:46 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 04:59:46 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.578s CPU time.
Dec  6 04:59:46 np0005548916 python3.9[222646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:47 np0005548916 python3.9[222722]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:47.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:47.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:48 np0005548916 python3.9[222872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:48 np0005548916 python3.9[222994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015187.5572815-3434-81769474876170/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:49 np0005548916 python3.9[223144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:49.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:49.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:49 np0005548916 python3.9[223265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015188.880816-3434-61046852501847/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095949 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:59:50 np0005548916 python3.9[223441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:50 np0005548916 python3.9[223562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015190.0367112-3434-130095729765037/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:51.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:51 np0005548916 python3.9[223712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:51.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:52 np0005548916 python3.9[223833]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015191.130659-3434-40697051335534/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:53 np0005548916 python3.9[223986]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:53.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:53.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:53 np0005548916 python3.9[224138]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 04:59:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.272 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 04:59:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.272 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 04:59:54 np0005548916 python3.9[224291]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:59:55 np0005548916 podman[224443]: 2025-12-06 09:59:55.37023775 +0000 UTC m=+0.138189412 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:59:55 np0005548916 python3.9[224444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:55.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:55.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:55 np0005548916 python3.9[224592]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765015194.9025126-3756-143507184368725/.source _original_basename=.u2lyrnjp follow=False checksum=1b13389afdbc18c3b0e4207972a5a874c4fd04bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  6 04:59:56 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 7.
Dec  6 04:59:56 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:59:56 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.578s CPU time.
Dec  6 04:59:56 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:59:56 np0005548916 podman[224711]: 2025-12-06 09:59:56.660122724 +0000 UTC m=+0.046867562 container create 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:59:56 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:59:56 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:59:56 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:59:56 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:59:56 np0005548916 podman[224711]: 2025-12-06 09:59:56.732124976 +0000 UTC m=+0.118869834 container init 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  6 04:59:56 np0005548916 podman[224711]: 2025-12-06 09:59:56.640845036 +0000 UTC m=+0.027589904 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:59:56 np0005548916 podman[224711]: 2025-12-06 09:59:56.737847238 +0000 UTC m=+0.124592076 container start 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:59:56 np0005548916 bash[224711]: 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:59:56 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:59:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:59:56 np0005548916 python3.9[224826]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:59:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:57.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:57.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:57 np0005548916 python3.9[225000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:58 np0005548916 python3.9[225121]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015197.3208287-3833-205626550502035/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=81f1f28d070b2613355f782b83a5777fdba9540e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:59 np0005548916 python3.9[225272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:59:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:59.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:59:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 04:59:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:59 np0005548916 python3.9[225393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015198.7398415-3878-200401887995763/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=2efe6ae78bce1c26d2c384be079fa366810076ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 05:00:00 np0005548916 ceph-mon[79770]: overall HEALTH_OK
Dec  6 05:00:00 np0005548916 podman[225518]: 2025-12-06 10:00:00.770509755 +0000 UTC m=+0.086847142 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:00:00 np0005548916 python3.9[225562]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  6 05:00:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:01.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:01.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:01 np0005548916 python3.9[225717]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 05:00:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:02 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:00:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:02 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:00:02 np0005548916 python3[225870]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 05:00:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:03.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:04 np0005548916 podman[225908]: 2025-12-06 10:00:04.764858913 +0000 UTC m=+0.067270727 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 05:00:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:05.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:05.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:07.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:07.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:00:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:09.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:10 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:11.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:11.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100011 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:00:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:00:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 3608 writes, 20K keys, 3608 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
Cumulative WAL: 3607 writes, 3607 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1396 writes, 6521 keys, 1396 commit groups, 1.0 writes per commit group, ingest: 16.14 MB, 0.03 MB/s
Interval WAL: 1395 writes, 1395 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    111.7      0.26              0.12         9    0.029       0      0       0.0       0.0
  L6      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.6    115.1    100.1      1.04              0.39         8    0.130     38K   4138       0.0       0.0
 Sum      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.6     92.1    102.4      1.30              0.51        17    0.076     38K   4138       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.6    112.5    112.4      0.40              0.14         6    0.067     16K   1857       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    115.1    100.1      1.04              0.39         8    0.130     38K   4138       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    112.5      0.26              0.12         8    0.032       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.028, interval 0.007
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.13 GB write, 0.11 MB/s write, 0.12 GB read, 0.10 MB/s read, 1.3 seconds
Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.08 MB/s read, 0.4 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 5.00 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000146 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(268,4.67 MB,1.53495%) FilterBlock(17,116.36 KB,0.037379%) IndexBlock(17,221.86 KB,0.0712696%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Dec  6 05:00:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:12 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:13.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:13 np0005548916 podman[225883]: 2025-12-06 10:00:13.579071167 +0000 UTC m=+10.636459758 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:13.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:13 np0005548916 podman[226111]: 2025-12-06 10:00:13.750799868 +0000 UTC m=+0.052502231 container create 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:00:13 np0005548916 podman[226111]: 2025-12-06 10:00:13.723337738 +0000 UTC m=+0.025040121 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:13 np0005548916 python3[225870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  6 05:00:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:15.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:15.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:16 np0005548916 ceph-mds[84241]: mds.beacon.cephfs.compute-1.fpvjgb missed beacon ack from the monitors
Dec  6 05:00:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:17.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:00:18 np0005548916 python3.9[226315]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:19.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:19 np0005548916 python3.9[226469]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  6 05:00:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:20 np0005548916 python3.9[226622]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 05:00:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:21.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:21.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:22 np0005548916 python3[226774]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 05:00:22 np0005548916 podman[226809]: 2025-12-06 10:00:22.395790383 +0000 UTC m=+0.051677981 container create fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 05:00:22 np0005548916 podman[226809]: 2025-12-06 10:00:22.368652151 +0000 UTC m=+0.024539769 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:22 np0005548916 python3[226774]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 kolla_start
Dec  6 05:00:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:23 np0005548916 python3.9[226997]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:23.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:23.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:24 np0005548916 python3.9[227151]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 05:00:24 np0005548916 python3.9[227328]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015224.1947293-4154-108310403032688/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 05:00:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:25 np0005548916 python3.9[227404]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 05:00:25 np0005548916 systemd[1]: Reloading.
Dec  6 05:00:25 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 05:00:25 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 05:00:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:25.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:25.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:25 np0005548916 podman[227441]: 2025-12-06 10:00:25.760670438 +0000 UTC m=+0.113092224 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:00:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:26 np0005548916 python3.9[227542]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 05:00:26 np0005548916 systemd[1]: Reloading.
Dec  6 05:00:26 np0005548916 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 05:00:26 np0005548916 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 05:00:26 np0005548916 systemd[1]: Starting nova_compute container...
Dec  6 05:00:26 np0005548916 systemd[1]: Started libcrun container.
Dec  6 05:00:26 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548916 podman[227582]: 2025-12-06 10:00:26.747229921 +0000 UTC m=+0.118631609 container init fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute)
Dec  6 05:00:26 np0005548916 podman[227582]: 2025-12-06 10:00:26.753669477 +0000 UTC m=+0.125071145 container start fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Dec  6 05:00:26 np0005548916 podman[227582]: nova_compute
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + sudo -E kolla_set_configs
Dec  6 05:00:26 np0005548916 systemd[1]: Started nova_compute container.
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Validating config file
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying service configuration files
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Deleting /etc/ceph
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Creating directory /etc/ceph
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Writing out command to execute
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548916 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548916 nova_compute[227597]: ++ cat /run_command
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + CMD=nova-compute
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + ARGS=
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + sudo kolla_copy_cacerts
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + [[ ! -n '' ]]
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + . kolla_extend_start
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + echo 'Running command: '\''nova-compute'\'''
Dec  6 05:00:26 np0005548916 nova_compute[227597]: Running command: 'nova-compute'
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + umask 0022
Dec  6 05:00:26 np0005548916 nova_compute[227597]: + exec nova-compute
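
The kolla_set_configs trace above (load config.json, validate, copy each source file to its destination, set permissions, write out the command) amounts to a copy-and-chmod loop driven by /var/lib/kolla/config_files/config.json, after which the wrapper reads /run_command and execs nova-compute. A rough Python sketch of that COPY_ALWAYS flow, for orientation only; the real tool ships inside the kolla image, and the "config_files" entries with "source", "dest", and "perm" keys are an assumption inferred from the log:

    import json
    import os
    import shutil

    # Illustration only: mirrors the "Copying ... / Setting permission ..."
    # steps logged by kolla_set_configs above; not Kolla's actual code.
    def copy_always(config_path="/var/lib/kolla/config_files/config.json"):
        with open(config_path) as f:
            cfg = json.load(f)
        for item in cfg.get("config_files", []):      # assumed schema
            src, dest = item["source"], item["dest"]
            print(f"Copying {src} to {dest}")
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            shutil.copy(src, dest)
            if "perm" in item:
                print(f"Setting permission for {dest}")
                os.chmod(dest, int(item["perm"], 8))  # perm is an octal string
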
Dec  6 05:00:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:27 np0005548916 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec  6 05:00:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:27.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:27.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:28 np0005548916 python3.9[227759]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:29 np0005548916 python3.9[227910]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.318 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.319 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.319 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.320 227601 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec  6 05:00:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:00:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:29.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.536 227601 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.554 227601 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:00:29 np0005548916 nova_compute[227597]: 2025-12-06 10:00:29.555 227601 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.643778) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229644824, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1264, "num_deletes": 251, "total_data_size": 3254279, "memory_usage": 3290384, "flush_reason": "Manual Compaction"}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229665057, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2113460, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19476, "largest_seqno": 20735, "table_properties": {"data_size": 2107863, "index_size": 2989, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11905, "raw_average_key_size": 19, "raw_value_size": 2096729, "raw_average_value_size": 3506, "num_data_blocks": 132, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015121, "oldest_key_time": 1765015121, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 21313 microseconds, and 7957 cpu microseconds.
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.665113) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2113460 bytes OK
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.665139) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670916) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670930) EVENT_LOG_v1 {"time_micros": 1765015229670925, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670951) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3248216, prev total WAL file size 3248216, number of live WAL files 2.
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.671969) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2063KB)], [36(13MB)]
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229672499, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 16100917, "oldest_snapshot_seqno": -1}
Dec  6 05:00:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:29.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5036 keys, 13954767 bytes, temperature: kUnknown
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229746546, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13954767, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13919523, "index_size": 21566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128338, "raw_average_key_size": 25, "raw_value_size": 13826447, "raw_average_value_size": 2745, "num_data_blocks": 885, "num_entries": 5036, "num_filter_entries": 5036, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.746874) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13954767 bytes
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.748525) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.1 rd, 188.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.3 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(14.2) write-amplify(6.6) OK, records in: 5556, records dropped: 520 output_compression: NoCompression
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.748542) EVENT_LOG_v1 {"time_micros": 1765015229748534, "job": 20, "event": "compaction_finished", "compaction_time_micros": 74147, "compaction_time_cpu_micros": 34413, "output_level": 6, "num_output_files": 1, "total_output_size": 13954767, "num_input_records": 5556, "num_output_records": 5036, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229748992, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229751492, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.671821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:29 np0005548916 python3.9[228064]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.197 227601 INFO nova.virt.driver [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.342 227601 INFO nova.compute.provider_config [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.385 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.385 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
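The cache.* block above is the stock oslo.cache option set: with cache.backend = oslo_cache.dict the service uses an in-process dictionary cache, and the memcache_* options stay unused until a memcached backend is selected. A minimal sketch of how a service consumes these options, assuming the standard oslo_cache.core helpers and the defaults logged above:

    from oslo_cache import core as cache
    from oslo_config import cfg

    CONF = cfg.CONF

    # Register the [cache] option group seen in the dump
    # (backend, enabled, expiration_time, memcache_servers, ...).
    cache.configure(CONF)

    # Build a dogpile.cache region and configure it from CONF; with
    # cache.backend = oslo_cache.dict this yields a per-process dict
    # cache whose TTL is cache.expiration_time (600 s above).
    region = cache.create_region()
    cache.configure_cache_region(CONF, region)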
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.422 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.422 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.468 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.469 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.469 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
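Options registered with secret=True are masked in this dump: database.connection and database.slave_connection (like cache.backend_argument earlier and key_manager.fixed_key at the end) print as **** rather than their real values. A small illustration of the masking, using a hypothetical standalone config object and a placeholder URL:

    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [cfg.StrOpt('connection', secret=True)],  # secret=True => masked
        group='database',
    )
    CONF([])  # parse an empty command line
    CONF.set_override(
        'connection', 'mysql+pymysql://nova:pw@db/nova',  # hypothetical URL
        group='database',
    )

    # Prints "database.connection = ****", matching the lines above;
    # this is the same ConfigOpts.log_opt_values() call that produced
    # this whole dump (cfg.py:2609 in each log line).
    CONF.log_opt_values(LOG, logging.DEBUG)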
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
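The glance.* client options above drive keystoneauth1 endpoint selection: the image endpoint is chosen by service_type = image, an interface from glance.valid_interfaces (['internal']), and region_name = regionOne. A hedged sketch of equivalent endpoint selection with keystoneauth1 (the auth URL and credentials here are placeholders, not values from this log):

    from keystoneauth1 import adapter
    from keystoneauth1 import session
    from keystoneauth1.identity import v3

    # Hypothetical credentials; the real ones live in nova.conf, not
    # in this option dump.
    auth = v3.Password(
        auth_url='http://keystone:5000/v3',
        username='nova', password='secret', project_name='service',
        user_domain_name='Default', project_domain_name='Default',
    )
    sess = session.Session(auth=auth)

    # Endpoint selection mirroring the glance.* options logged above:
    # service_type = image, interface = 'internal',
    # region_name = 'regionOne'.
    glance = adapter.Adapter(
        sess, service_type='image', interface='internal',
        region_name='regionOne',
    )

    # e.g. glance.get('/v2/images') would hit the internal image
    # endpoint for regionOne from the service catalog.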
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
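The ironic.* lines above are part of oslo.config's startup dump: at DEBUG level, log_opt_values() (cfg.py:2609, cited in every line) walks each registered option group and prints its effective value, masking anything declared secret. Almost the whole [ironic] group is unset here; a minimal nova.conf stanza reproducing the only two populated values, both of which appear to be the upstream defaults, would look like this (a sketch, not taken from the host's actual config file):

    [ironic]
    service_type = baremetal
    valid_interfaces = internal,public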
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
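key_manager.backend = barbican selects the Barbican key store, and fixed_key is printed as **** because oslo.config masks options declared secret, so its real value never reaches the log. A sketch of the matching stanza, with the masked value left unfilled:

    [key_manager]
    backend = barbican
    # fixed_key is masked (****) in the dump above; leave it unset
    # unless a deployment-provided fixed key is actually in use.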
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
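The [barbican] group suggests catalog-based discovery: barbican_endpoint is unset while barbican_endpoint_type = internal, and auth_endpoint still carries the upstream localhost default. A reconstruction of the stanza using only values visible in the dump:

    [barbican]
    auth_endpoint = http://localhost/identity/v3
    barbican_endpoint_type = internal
    number_of_retries = 60
    retry_delay = 1
    verify_ssl = True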
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
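Every vault.* option above sits at its default (vault_url = http://127.0.0.1:8200, kv_version = 2, use_ssl = False), consistent with Barbican being the active key manager. If Vault were used instead, the minimum change would be along these lines; this is purely hypothetical for this deployment, with the values below being the defaults the dump already shows:

    [key_manager]
    backend = vault

    [vault]
    # illustrative only; this host logs these same values as defaults
    vault_url = http://127.0.0.1:8200
    kv_mountpoint = secret
    kv_version = 2
    use_ssl = False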
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
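The [keystone] group follows the same keystoneauth adapter pattern as [ironic]: a service_type, a valid_interfaces list, and per-request TLS/retry knobs, all left at defaults here. The recurring shape of such a stanza, sketched from the values logged above:

    [keystone]
    service_type = identity
    valid_interfaces = internal,public
    # cafile/certfile/keyfile and the retry options are unset,
    # so keystoneauth falls back to its library defaults.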
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 WARNING oslo_config.cfg [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  6 05:00:30 np0005548916 nova_compute[227597]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  6 05:00:30 np0005548916 nova_compute[227597]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  6 05:00:30 np0005548916 nova_compute[227597]: and ``live_migration_inbound_addr`` respectively.
Dec  6 05:00:30 np0005548916 nova_compute[227597]: ).  Its value may be silently ignored in the future.#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
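The warning above records that live_migration_uri is still set (qemu+tls://%s/system) even though it is deprecated in favor of live_migration_scheme and live_migration_inbound_addr. A sketch of the non-deprecated equivalent, assuming the only intent of the old URI was to force the TLS transport; the inbound address is deployment-specific and left as a placeholder:

    [libvirt]
    # replaces live_migration_uri = qemu+tls://%s/system
    live_migration_scheme = tls
    # live_migration_inbound_addr = <migration-network address of this host>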
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
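The RBD-related values scattered through the dump (images_type = rbd, images_rbd_pool = vms, images_rbd_ceph_conf = /etc/ceph/ceph.conf, rbd_user = openstack, and the rbd_secret_uuid above) together describe Ceph-backed ephemeral disks. Collected into one stanza, again built only from the logged values:

    [libvirt]
    images_type = rbd
    images_rbd_pool = vms
    images_rbd_ceph_conf = /etc/ceph/ceph.conf
    images_rbd_glance_store_name = default_backend
    rbd_user = openstack
    rbd_secret_uuid = 5ecd3f74-dade-5fc4-92ce-8950ae424258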
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
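Beyond storage and migration, the remaining notable [libvirt] values in the dump sketch the virtualization profile of this node: KVM with host-model CPUs, the q35 machine type, an emulated TPM, multipath for volumes, and enlarged virtio queue sizes. As one consolidated stanza (values copied from the dump; anything not listed stays at its default):

    [libvirt]
    virt_type = kvm
    cpu_mode = host-model
    hw_machine_type = x86_64=q35
    swtpm_enabled = True
    volume_use_multipath = True
    rx_queue_size = 512
    tx_queue_size = 512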
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
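The [neutron] group carries real overrides: password auth, regionOne, the internal interface only, and the metadata proxy enabled with a masked shared secret. Reconstructed as a stanza from the logged values:

    [neutron]
    auth_type = password
    region_name = regionOne
    valid_interfaces = internal
    ovs_bridge = br-int
    default_floating_pool = nova
    service_metadata_proxy = True
    # metadata_proxy_shared_secret is masked (****) in the log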
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
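
[annotation] The [placement] block above is a standard keystoneauth1 password-auth section (placement.auth_type = password); the masked placement.password stays masked here too. A hedged sketch of how those values typically become an authenticated session, with placement.valid_interfaces and placement.region_name selecting the catalog endpoint; the GET path is illustrative, not from this log:

    # Sketch only (not nova's own code): the logged [placement] credentials
    # as a keystoneauth1 session plus catalog-based endpoint selection.
    from keystoneauth1 import session
    from keystoneauth1.identity import v3

    auth = v3.Password(
        auth_url='https://keystone-internal.openstack.svc:5000',
        username='nova',
        password='****',  # masked in the log; the real value lives in nova.conf
        project_name='service',
        user_domain_name='Default',
        project_domain_name='Default',
    )
    sess = session.Session(auth=auth)
    resp = sess.get('/', endpoint_filter={'service_type': 'placement',
                                          'interface': 'internal',
                                          'region_name': 'regionOne'})
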
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
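
[annotation] filter_scheduler.enabled_filters prunes candidate hosts; the weighers loaded via filter_scheduler.weight_classes then rank the survivors. Nova normalizes each weigher's raw values across hosts before applying its multiplier (ram 1.0, io_ops -1.0, build failures 1000000.0 above), so one metric's scale cannot swamp another's. A toy illustration of that combine step, not nova's actual weigher classes; the host data is made up:

    # Toy weigher combine: normalize each metric to [0, 1], scale by its
    # multiplier, sum per host, rank. Multipliers mirror the logged values.
    def normalize(values):
        lo, hi = min(values), max(values)
        return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

    def rank(hosts, weighers):
        totals = [0.0] * len(hosts)
        for mult, metric in weighers:
            for i, w in enumerate(normalize([metric(h) for h in hosts])):
                totals[i] += mult * w
        return sorted(zip(totals, hosts), key=lambda t: t[0], reverse=True)

    hosts = [{'name': 'h1', 'free_ram': 8192, 'io_ops': 8},
             {'name': 'h2', 'free_ram': 4096, 'io_ops': 0},
             {'name': 'h3', 'free_ram': 2048, 'io_ops': 1}]
    ranked = rank(hosts, [(1.0, lambda h: h['free_ram']),   # ram_weight_multiplier
                          (-1.0, lambda h: h['io_ops'])])   # io_ops_weight_multiplier
    print(ranked)  # h2 ranks first: mid-sized RAM but no I/O load
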
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
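
[annotation] service_user.send_service_user_token = True makes nova attach a service-scoped token (the X-Service-Token header) alongside the user's token on calls to other services, so long-running operations survive user-token expiry. A speculative sketch using keystoneauth1's ServiceTokenAuthWrapper; USER_TOKEN, PROJECT_ID, and the nova service credentials below are placeholders, not values from this log:

    # Hedged sketch: wrap user auth with the service credential so outgoing
    # requests carry both X-Auth-Token and X-Service-Token.
    from keystoneauth1 import service_token, session
    from keystoneauth1.identity import v3

    user_auth = v3.Token(auth_url='https://keystone-internal.openstack.svc:5000',
                         token='USER_TOKEN',       # placeholder
                         project_id='PROJECT_ID')  # placeholder
    svc_auth = v3.Password(auth_url='https://keystone-internal.openstack.svc:5000',
                           username='nova', password='****',  # assumed credentials
                           project_name='service',
                           user_domain_name='Default',
                           project_domain_name='Default')
    sess = session.Session(
        auth=service_token.ServiceTokenAuthWrapper(user_auth, svc_auth))
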
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.578 227601 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.590 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.591 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.591 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.592 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  6 05:00:30 np0005548916 systemd[1]: Starting libvirt QEMU daemon...
Dec  6 05:00:30 np0005548916 systemd[1]: Started libvirt QEMU daemon.
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.672 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5c469a0910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.676 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5c469a0910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.677 227601 INFO nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.699 227601 WARNING nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  6 05:00:30 np0005548916 nova_compute[227597]: 2025-12-06 10:00:30.700 227601 DEBUG nova.virt.libvirt.volume.mount [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  6 05:00:30 np0005548916 podman[228274]: 2025-12-06 10:00:30.939224232 +0000 UTC m=+0.126843398 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec  6 05:00:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:31 np0005548916 python3.9[228305]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  6 05:00:31 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:00:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:31.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.650 227601 INFO nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host capabilities <capabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <host>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <uuid>9a5f3f62-e1ed-4c63-8d00-a3c5e56bbddc</uuid>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <arch>x86_64</arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <microcode version='16777317'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <signature family='23' model='49' stepping='0'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='x2apic'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='tsc-deadline'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='osxsave'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='hypervisor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='tsc_adjust'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='spec-ctrl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='stibp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='arch-capabilities'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='cmp_legacy'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='topoext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='virt-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='lbrv'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='tsc-scale'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='vmcb-clean'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='pause-filter'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='pfthreshold'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='rdctl-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='skip-l1dfl-vmentry'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='mds-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature name='pschange-mc-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <pages unit='KiB' size='4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <pages unit='KiB' size='2048'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <pages unit='KiB' size='1048576'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <power_management>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <suspend_mem/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </power_management>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <iommu support='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <migration_features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <live/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <uri_transports>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <uri_transport>tcp</uri_transport>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <uri_transport>rdma</uri_transport>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </uri_transports>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </migration_features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <topology>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <cells num='1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <cell id='0'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <memory unit='KiB'>7864312</memory>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <pages unit='KiB' size='4'>1966078</pages>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <pages unit='KiB' size='2048'>0</pages>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <distances>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <sibling id='0' value='10'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          </distances>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          <cpus num='8'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:          </cpus>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        </cell>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </cells>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </topology>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <cache>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </cache>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <secmodel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model>selinux</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <doi>0</doi>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </secmodel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <secmodel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model>dac</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <doi>0</doi>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </secmodel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </host>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <guest>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <os_type>hvm</os_type>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <arch name='i686'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <wordsize>32</wordsize>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <domain type='qemu'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <domain type='kvm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <pae/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <nonpae/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <apic default='on' toggle='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <cpuselection/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <deviceboot/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <externalSnapshot/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </guest>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <guest>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <os_type>hvm</os_type>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <arch name='x86_64'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <wordsize>64</wordsize>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <domain type='qemu'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <domain type='kvm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <apic default='on' toggle='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <cpuselection/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <deviceboot/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <externalSnapshot/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </guest>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 
Dec  6 05:00:31 np0005548916 nova_compute[227597]: </capabilities>
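The `<capabilities>` document closed above is the output of libvirt's host capabilities API, which nova_compute logs verbatim at DEBUG level. As a minimal sketch of how the same XML can be fetched directly (assuming the libvirt Python bindings are installed; the qemu:///system connection URI is an assumption, not taken from this log):

    # Illustrative sketch only; URI and privileges are assumptions.
    import libvirt

    conn = libvirt.open('qemu:///system')  # local QEMU/KVM driver
    caps_xml = conn.getCapabilities()      # the <capabilities> XML logged above
    print(caps_xml)
    conn.close()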
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.657 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
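The debug line above shows nova iterating the supported machine types per guest architecture (see _get_machine_types in nova/virt/libvirt/host.py). A hedged sketch of the underlying libvirt call that produces the `<domainCapabilities>` documents logged below; the emulator path, arch, and machine names mirror the log, while the 'kvm' virt type and the connection URI are assumptions:

    # Illustrative sketch only; virttype and URI are assumptions.
    import libvirt

    conn = libvirt.open('qemu:///system')
    for machine in ('pc', 'q35'):
        dom_caps = conn.getDomainCapabilities(
            '/usr/libexec/qemu-kvm',  # emulator binary, as logged in <path>
            'i686',                   # arch from the debug line above
            machine,                  # machine type from {'pc', 'q35'}
            'kvm',                    # virt type (assumption)
        )
        print(dom_caps)               # one <domainCapabilities> XML per machine
    conn.close()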
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.678 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  6 05:00:31 np0005548916 nova_compute[227597]: <domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <arch>i686</arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <vcpu max='240'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <os supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>rom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pflash</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>yes</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='secure'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </loader>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </os>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>memfd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </memoryBacking>
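The memoryBacking block above advertises three guest RAM sources: file-backed, plain anonymous memory, and memfd. A minimal domain-XML sketch of requesting memfd-backed, shared guest memory (useful for vhost-user and virtiofs devices); this is standard libvirt syntax, not taken from this log:

    <memoryBacking>
      <source type='memfd'/>
      <access mode='shared'/>
    </memoryBacking>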
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>disk</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>floppy</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>lun</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ide</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>fdc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>sata</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </disk>
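Given the diskDevice/bus/model enums above, a typical guest disk pairs device='disk' with bus='virtio'. A minimal sketch (the qcow2 path and target name are illustrative assumptions):

    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/guest.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>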
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vnc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </graphics>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <video supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vga</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>none</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>bochs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </video>
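The graphics and video enums combine naturally into a VNC-served console backed by a virtio display model. A minimal sketch using only values advertised above (the port and listen settings are illustrative assumptions):

    <graphics type='vnc' port='-1' autoport='yes' listen='0.0.0.0'/>
    <video>
      <model type='virtio'/>
    </video>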
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='mode'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>requisite</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>optional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pci</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hostdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>random</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </rng>
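The rng section reports a virtio model with a 'random' backend, which in a domain definition points at a host entropy source. A minimal sketch, assuming /dev/urandom as that source:

    <rng model='virtio'>
      <backend model='random'>/dev/urandom</backend>
    </rng>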
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>path</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>handle</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </filesystem>
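The filesystem driverType enum includes virtiofs, which exports a host directory into the guest. A minimal sketch (directory path and mount tag are illustrative assumptions); note virtiofs also needs shared guest memory, e.g. the memfd memoryBacking shown earlier:

    <filesystem type='mount' accessmode='passthrough'>
      <driver type='virtiofs'/>
      <source dir='/export/shared'/>
      <target dir='shared'/>
    </filesystem>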
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emulator</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>external</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>2.0</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </tpm>
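Per the tpm section, this host can emulate a TPM 2.0 device through the swtpm-based 'emulator' backend. A minimal sketch combining the advertised model, backend, and version values:

    <tpm model='tpm-crb'>
      <backend type='emulator' version='2.0'/>
    </tpm>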
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </redirdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </channel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </crypto>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>passt</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </interface>
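The interface backendType enum shows passt is available as a user-mode networking backend. A minimal sketch, assuming a type='user' interface (the MAC address is illustrative):

    <interface type='user'>
      <backend type='passt'/>
      <mac address='52:54:00:12:34:56'/>
      <model type='virtio'/>
    </interface>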
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>isa</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </panic>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <console supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>null</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dev</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pipe</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stdio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>udp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tcp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </console>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='features'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vapic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>runtime</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>synic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stimer</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reset</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ipi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>avic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hyperv>
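The hyperv block lists the enlightenments this host can expose to Windows guests, plus the defaults libvirt applies (spinlock retries 4095, direct-mode stimer, extended/direct tlbflush). A minimal sketch enabling a common subset; stimer depends on synic, which depends on vpindex:

    <features>
      <hyperv>
        <relaxed state='on'/>
        <vapic state='on'/>
        <spinlocks state='on' retries='4095'/>
        <vpindex state='on'/>
        <synic state='on'/>
        <stimer state='on'/>
      </hyperv>
    </features>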
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tdx</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </launchSecurity>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: </domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
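Everything nova just logged is libvirt's domainCapabilities report: each <model usable='no'> is paired with a <blockers> element naming exactly which features this host lacks (or has disabled) for that CPU model. A minimal libvirt-python sketch that fetches the same report and summarizes it; the connection URI, x86_64 arch, and q35 machine type are illustrative assumptions, not values from this log entry:

    import xml.etree.ElementTree as ET
    import libvirt

    # Same query nova's _get_domain_capabilities issues via the libvirt API.
    conn = libvirt.open('qemu:///system')
    caps = conn.getDomainCapabilities('/usr/libexec/qemu-kvm', 'x86_64', 'q35', 'kvm')
    conn.close()

    # Walk the custom CPU mode and print each unusable model with its blockers.
    custom = ET.fromstring(caps).find("./cpu/mode[@name='custom']")
    for model in custom.findall('model'):
        if model.get('usable') != 'no':
            continue
        blockers = custom.find("./blockers[@model='%s']" % model.text)
        feats = [] if blockers is None else [f.get('name') for f in blockers.findall('feature')]
        print('%-28s blocked by: %s' % (model.text, ', '.join(feats)))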
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.685 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  6 05:00:31 np0005548916 nova_compute[227597]: <domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <arch>i686</arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <vcpu max='4096'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <os supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>rom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pflash</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>yes</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='secure'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </loader>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </os>
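This i686 report advertises a single pflash-capable loader, the secboot OVMF build, with the secure enum limited to 'no'. A minimal sketch of the corresponding <os> block (the NVRAM template and per-guest NVRAM path are illustrative assumptions):

    <os>
      <loader readonly='yes' secure='no' type='pflash'>/usr/share/OVMF/OVMF_CODE.secboot.fd</loader>
      <nvram template='/usr/share/OVMF/OVMF_VARS.fd'>/var/lib/libvirt/qemu/nvram/guest_VARS.fd</nvram>
    </os>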
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:31.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
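Dec  6 05:00:31 np0005548916 nova_compute[227597]: [editor: Nova selects among these named models through the [libvirt] cpu_mode and cpu_models options; per the entries above, only models reported usable='yes' on this host (for example EPYC-v1, EPYC-v2, or EPYC-Rome-v4) are safe choices, while any model with a <blockers> list would be rejected. A hypothetical nova.conf sketch, purely illustrative:
Dec  6 05:00:31 np0005548916 nova_compute[227597]:
Dec  6 05:00:31 np0005548916 nova_compute[227597]:   [libvirt]
Dec  6 05:00:31 np0005548916 nova_compute[227597]:   # Pick a model this log reports as usable='yes'; EPYC-v3/EPYC-v4
Dec  6 05:00:31 np0005548916 nova_compute[227597]:   # would fail here because the host lacks xsaves, per the blockers above.
Dec  6 05:00:31 np0005548916 nova_compute[227597]:   cpu_mode = custom
Dec  6 05:00:31 np0005548916 nova_compute[227597]:   cpu_models = EPYC-v1]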
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>memfd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </memoryBacking>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>disk</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>floppy</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>lun</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>fdc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>sata</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </disk>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vnc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </graphics>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <video supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vga</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>none</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>bochs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </video>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='mode'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>requisite</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>optional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pci</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hostdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>random</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </rng>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>path</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>handle</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </filesystem>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emulator</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>external</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>2.0</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </tpm>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </redirdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </channel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </crypto>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>passt</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </interface>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>isa</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </panic>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <console supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>null</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dev</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pipe</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stdio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>udp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tcp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </console>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='features'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vapic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>runtime</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>synic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stimer</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reset</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ipi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>avic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hyperv>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tdx</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </launchSecurity>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: </domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
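The dump above is the result of libvirt's getDomainCapabilities call, which Nova issues once per machine type. A minimal sketch, not Nova's own code, of fetching the same XML with the libvirt-python bindings (the CLI equivalent is `virsh domcapabilities`); the emulator path, arch, and virt type are the values reported in the dump itself:

    import libvirt

    conn = libvirt.open('qemu:///system')   # assumes a local libvirtd
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm',   # emulator binary, as seen in <path>
        'x86_64',                  # architecture, as seen in <arch>
        'q35',                     # machine type; Nova queries both 'pc' and 'q35' here
        'kvm',                     # virt type, as seen in <domain>
        0,                         # flags (unused)
    )
    print(caps_xml)                # prints a <domainCapabilities> document
    conn.close()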
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.713 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
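The pc-machine dump that follows lists, under <mode name='custom'>, every named CPU model with a usable= flag; each unusable model is paired with a <blockers> element naming the features that model requires but this host cannot provide (per the host-model block below, an AMD EPYC-Rome host with xsaves disabled, so xsaves and the Intel-only flags account for most blockers). A minimal parsing sketch, assuming caps_xml holds such a document as fetched above:

    import xml.etree.ElementTree as ET

    root = ET.fromstring(caps_xml)
    custom = root.find("./cpu/mode[@name='custom']")
    # Map model name -> list of blocking feature names.
    blockers = {
        b.get('model'): [f.get('name') for f in b.findall('feature')]
        for b in custom.findall('blockers')
    }
    for model in custom.findall('model'):
        name = model.text
        if model.get('usable') == 'yes':
            print(f'{name}: usable')
        else:
            print(f"{name}: blocked by {', '.join(blockers.get(name, []))}")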
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.718 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  6 05:00:31 np0005548916 nova_compute[227597]: <domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <arch>x86_64</arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <vcpu max='240'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <os supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>rom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pflash</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>yes</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='secure'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </loader>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </os>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>memfd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </memoryBacking>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>disk</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>floppy</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>lun</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ide</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>fdc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>sata</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </disk>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vnc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </graphics>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <video supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vga</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>none</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>bochs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </video>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='mode'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>requisite</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>optional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pci</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hostdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>random</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </rng>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>path</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>handle</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </filesystem>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emulator</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>external</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>2.0</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </tpm>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </redirdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </channel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </crypto>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>passt</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </interface>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>isa</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </panic>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <console supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>null</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dev</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pipe</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stdio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>udp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tcp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </console>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='features'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vapic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>runtime</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>synic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stimer</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reset</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ipi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>avic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hyperv>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tdx</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </launchSecurity>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: </domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.782 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  6 05:00:31 np0005548916 nova_compute[227597]: <domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <arch>x86_64</arch>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <vcpu max='4096'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <os supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='firmware'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>efi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>rom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pflash</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>yes</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='secure'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>yes</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>no</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </loader>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </os>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>on</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>off</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </blockers>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </mode>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </cpu>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <value>memfd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </memoryBacking>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>disk</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>floppy</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>lun</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>fdc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>sata</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </disk>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vnc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </graphics>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <video supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vga</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>none</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>bochs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </video>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='mode'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>requisite</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>optional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pci</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>scsi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hostdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>random</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>egd</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </rng>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>path</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>handle</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </filesystem>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emulator</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>external</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>2.0</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </tpm>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='bus'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>usb</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </redirdev>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </channel>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>builtin</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </crypto>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>default</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>passt</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </interface>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='model'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>isa</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </panic>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <console supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='type'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>null</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vc</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pty</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dev</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>file</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>pipe</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stdio</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>udp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tcp</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>unix</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>dbus</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </console>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </devices>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  <features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='features'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vapic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>runtime</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>synic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>stimer</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reset</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>ipi</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>avic</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </defaults>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </hyperv>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:        <value>tdx</value>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:      </enum>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:    </launchSecurity>
Dec  6 05:00:31 np0005548916 nova_compute[227597]:  </features>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: </domainCapabilities>
Dec  6 05:00:31 np0005548916 nova_compute[227597]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
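[annotation] The XML block above is libvirt's domainCapabilities document, which nova-compute fetches during startup to learn which devices and features (TPM backends, launch security, Hyper-V enlightenments, and so on) the hypervisor supports; the secure-boot probe logged next is answered from the same document. A minimal sketch, assuming the libvirt-python bindings and a local qemu:///system connection (not nova's own code), that retrieves and inspects it:

    import xml.etree.ElementTree as ET
    import libvirt

    conn = libvirt.open('qemu:///system')
    # Arguments: emulator binary, arch, machine type, virt type; passing None
    # lets libvirt pick defaults for this host, mirroring the dump above.
    caps_xml = conn.getDomainCapabilities(None, 'x86_64', None, 'kvm', 0)
    root = ET.fromstring(caps_xml)

    # TPM backend models, as enumerated in the log: ['emulator', 'external']
    tpm = [v.text for v in root.findall(".//tpm/enum[@name='backendModel']/value")]
    print('TPM backends:', tpm)

    # SEV shows supported='no' on this host, matching the <features> section
    print('SEV supported:', root.find(".//features/sev").get('supported'))
    conn.close()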
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.855 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.856 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.856 227601 INFO nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Secure Boot support detected#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.867 227601 INFO nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.885 227601 DEBUG nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.907 227601 INFO nova.virt.node [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Determined node identity ff2f17cb-ff1d-4da7-9560-4be741380cb1 from /var/lib/nova/compute_id#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.919 227601 WARNING nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Compute nodes ['ff2f17cb-ff1d-4da7-9560-4be741380cb1'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.940 227601 INFO nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  6 05:00:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.965 227601 WARNING nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.965 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
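[annotation] The Acquiring/acquired/released trio above is the standard trace emitted by oslo.concurrency's synchronized decorator, which serializes resource-tracker operations on the named "compute_resources" lock. A minimal, self-contained usage sketch (the function body is illustrative, not nova's):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        # Runs with the in-process "compute_resources" lock held; entering and
        # leaving produces exactly the Acquiring/acquired/released DEBUG lines
        # seen above when debug logging is enabled.
        pass

    clean_compute_node_cache()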
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG nova.compute.resource_tracker [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:00:31 np0005548916 nova_compute[227597]: 2025-12-06 10:00:31.967 227601 DEBUG oslo_concurrency.processutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
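[annotation] With the RBD image backend, the resource audit shells out to ceph df (the subprocess logged above) to learn cluster and pool capacity. A hedged standalone equivalent of that probe, under the assumption that the JSON layout has the usual top-level 'stats' and 'pools' keys; the 'vms' pool name is purely illustrative (the real name comes from [libvirt]/images_rbd_pool in nova.conf):

    import json
    import subprocess

    out = subprocess.check_output(
        ['ceph', 'df', '--format=json',
         '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf'])
    stats = json.loads(out)

    total = stats['stats']['total_bytes']
    avail = stats['stats']['total_avail_bytes']
    print(f'cluster: {avail / 2**30:.1f} GiB free of {total / 2**30:.1f} GiB')

    # Per-pool stats; 'vms' is a hypothetical pool name for this sketch.
    for pool in stats['pools']:
        if pool['name'] == 'vms':
            print('vms bytes_used:', pool['stats']['bytes_used'])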
Dec  6 05:00:32 np0005548916 python3.9[228498]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 05:00:32 np0005548916 systemd[1]: Stopping nova_compute container...
Dec  6 05:00:32 np0005548916 nova_compute[227597]: 2025-12-06 10:00:32.312 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:00:32 np0005548916 nova_compute[227597]: 2025-12-06 10:00:32.312 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:00:32 np0005548916 nova_compute[227597]: 2025-12-06 10:00:32.313 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:00:32 np0005548916 virtqemud[228188]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  6 05:00:32 np0005548916 virtqemud[228188]: hostname: compute-1
Dec  6 05:00:32 np0005548916 virtqemud[228188]: End of file while reading data: Input/output error
Dec  6 05:00:32 np0005548916 systemd[1]: libpod-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba.scope: Deactivated successfully.
Dec  6 05:00:32 np0005548916 systemd[1]: libpod-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba.scope: Consumed 4.076s CPU time.
Dec  6 05:00:32 np0005548916 podman[228522]: 2025-12-06 10:00:32.930521627 +0000 UTC m=+0.669463731 container died fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Dec  6 05:00:32 np0005548916 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba-userdata-shm.mount: Deactivated successfully.
Dec  6 05:00:32 np0005548916 systemd[1]: var-lib-containers-storage-overlay-fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3-merged.mount: Deactivated successfully.
Dec  6 05:00:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:00:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:33.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:00:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:33.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:33 np0005548916 podman[228522]: 2025-12-06 10:00:33.781511201 +0000 UTC m=+1.520453275 container cleanup fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:00:33 np0005548916 podman[228522]: nova_compute
Dec  6 05:00:33 np0005548916 podman[228547]: nova_compute
Dec  6 05:00:33 np0005548916 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  6 05:00:33 np0005548916 systemd[1]: Stopped nova_compute container.
Dec  6 05:00:33 np0005548916 systemd[1]: Starting nova_compute container...
Dec  6 05:00:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc0029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:33 np0005548916 systemd[1]: Started libcrun container.
Dec  6 05:00:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:33 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:33 np0005548916 podman[228560]: 2025-12-06 10:00:33.991919396 +0000 UTC m=+0.108064673 container init fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec  6 05:00:34 np0005548916 podman[228560]: 2025-12-06 10:00:34.005508885 +0000 UTC m=+0.121654112 container start fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  6 05:00:34 np0005548916 podman[228560]: nova_compute
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + sudo -E kolla_set_configs
Dec  6 05:00:34 np0005548916 systemd[1]: Started nova_compute container.
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Validating config file
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying service configuration files
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /etc/ceph
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Creating directory /etc/ceph
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Writing out command to execute
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548916 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548916 nova_compute[228576]: ++ cat /run_command
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + CMD=nova-compute
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + ARGS=
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + sudo kolla_copy_cacerts
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + [[ ! -n '' ]]
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + . kolla_extend_start
Dec  6 05:00:34 np0005548916 nova_compute[228576]: Running command: 'nova-compute'
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + echo 'Running command: '\''nova-compute'\'''
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + umask 0022
Dec  6 05:00:34 np0005548916 nova_compute[228576]: + exec nova-compute
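[annotation] The trace above is kolla's container startup contract: kolla_set_configs reads /var/lib/kolla/config_files/config.json and, under the COPY_ALWAYS strategy, re-copies every configuration file into place on each start, after which the wrapper execs the command stored in /run_command (nova-compute here). A much-simplified sketch of that copy loop, assuming the documented config.json keys (source/dest/perm) and handling plain files only; the real script also handles directories, globs, and ownership:

    import json
    import os
    import shutil

    with open('/var/lib/kolla/config_files/config.json') as f:
        cfg = json.load(f)

    for item in cfg.get('config_files', []):
        src, dest = item['source'], item['dest']
        if os.path.isfile(dest):
            print(f'Deleting {dest}')            # matches the log lines above
            os.remove(dest)
        print(f'Copying {src} to {dest}')
        shutil.copy(src, dest)
        print(f'Setting permission for {dest}')
        os.chmod(dest, int(item.get('perm', '0600'), 8))

    # The real tool then writes the 'command' key out to /run_command, which
    # the shell wrapper cats and execs, as traced above.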
Dec  6 05:00:34 np0005548916 podman[228740]: 2025-12-06 10:00:34.973131488 +0000 UTC m=+0.090427744 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Dec  6 05:00:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:35 np0005548916 python3.9[228741]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  6 05:00:35 np0005548916 systemd[1]: Started libpod-conmon-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope.
Dec  6 05:00:35 np0005548916 systemd[1]: Started libcrun container.
Dec  6 05:00:35 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548916 podman[228782]: 2025-12-06 10:00:35.410414495 +0000 UTC m=+0.135462066 container init 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec  6 05:00:35 np0005548916 podman[228782]: 2025-12-06 10:00:35.419585618 +0000 UTC m=+0.144633199 container start 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:00:35 np0005548916 python3.9[228741]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Applying nova statedir ownership
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  6 05:00:35 np0005548916 nova_compute_init[228804]: INFO:nova_statedir:Nova statedir ownership complete
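[annotation] nova_compute_init exists because the shared state directory can carry ownership from an earlier deployment generation: the walk above moves /var/lib/nova from uid/gid 1000 to kolla's nova user 42436 and relabels it for containers, skipping the path named in NOVA_STATEDIR_OWNERSHIP_SKIP (the compute_id file). A minimal sketch of the ownership part only (SELinux relabelling omitted; the real logic is nova_statedir_ownership.py, mounted into the container per the podman config above):

    import os

    TARGET_UID = TARGET_GID = 42436      # kolla's nova user, per the log
    SKIP = os.environ.get('NOVA_STATEDIR_OWNERSHIP_SKIP', '')

    def fix(path):
        if path == SKIP:
            return
        st = os.lstat(path)
        if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
            print(f'Changing ownership of {path} from '
                  f'{st.st_uid}:{st.st_gid} to {TARGET_UID}:{TARGET_GID}')
            os.lchown(path, TARGET_UID, TARGET_GID)

    for dirpath, dirnames, filenames in os.walk('/var/lib/nova'):
        fix(dirpath)
        for name in filenames:
            fix(os.path.join(dirpath, name))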
Dec  6 05:00:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:35.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:35 np0005548916 systemd[1]: libpod-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope: Deactivated successfully.
Dec  6 05:00:35 np0005548916 podman[228805]: 2025-12-06 10:00:35.508951036 +0000 UTC m=+0.044594303 container died 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:00:35 np0005548916 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2-userdata-shm.mount: Deactivated successfully.
Dec  6 05:00:35 np0005548916 systemd[1]: var-lib-containers-storage-overlay-69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81-merged.mount: Deactivated successfully.
Dec  6 05:00:35 np0005548916 podman[228816]: 2025-12-06 10:00:35.585137544 +0000 UTC m=+0.067513339 container cleanup 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:00:35 np0005548916 systemd[1]: libpod-conmon-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope: Deactivated successfully.
Dec  6 05:00:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:00:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:35.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc0029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:36 np0005548916 systemd[1]: session-53.scope: Deactivated successfully.
Dec  6 05:00:36 np0005548916 systemd[1]: session-53.scope: Consumed 2min 25.291s CPU time.
Dec  6 05:00:36 np0005548916 systemd-logind[788]: Session 53 logged out. Waiting for processes to exit.
Dec  6 05:00:36 np0005548916 systemd-logind[788]: Removed session 53.
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.253 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
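[annotation] The three "Loaded VIF plugin class" lines come from os_vif's plugin discovery, which uses stevedore to load every driver registered under the 'os_vif' setuptools entry-point namespace (os_vif.initialize() is the supported entry point). A standalone sketch that merely enumerates the same installed plugins:

    from stevedore import extension

    # Each installed package (vif_plug_ovs, vif_plug_linux_bridge, ...) adds an
    # entry point under the 'os_vif' namespace; stevedore loads them by name.
    mgr = extension.ExtensionManager(namespace='os_vif', invoke_on_load=False)
    for ext in sorted(mgr, key=lambda e: e.name):
        print(f"Loaded VIF plugin class {ext.plugin!r} with name '{ext.name}'")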
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.442 228580 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.456 228580 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.457 228580 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
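[annotation] The failing grep above probes whether the iscsiadm binary understands manual session scans: exit status 0 (the node.session.scan string is present in the binary) means automatic scans can be suppressed, 1 means not. It plausibly returns 1 here because kolla_set_configs replaced /usr/sbin/iscsiadm with the run-on-host shim earlier in this same log. A hedged standalone equivalent of the probe:

    import subprocess

    res = subprocess.run(
        ['grep', '-F', 'node.session.scan', '/sbin/iscsiadm'],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    manual_scan_supported = (res.returncode == 0)
    print('iscsiadm manual scan supported:', manual_scan_supported)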
Dec  6 05:00:36 np0005548916 nova_compute[228576]: 2025-12-06 10:00:36.927 228580 INFO nova.virt.driver [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  6 05:00:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:37 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.075 228580 INFO nova.compute.provider_config [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 WARNING oslo_config.cfg [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  6 05:00:37 np0005548916 nova_compute[228576]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  6 05:00:37 np0005548916 nova_compute[228576]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  6 05:00:37 np0005548916 nova_compute[228576]: and ``live_migration_inbound_addr`` respectively.
Dec  6 05:00:37 np0005548916 nova_compute[228576]: ).  Its value may be silently ignored in the future.#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
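
The WARNING above flags live_migration_uri as deprecated in favor of the two options it names, live_migration_scheme and live_migration_inbound_addr. A sketch of the replacement settings implied by the logged URI qemu+tls://%s/system follows; the scheme "tls" is inferred from that URI, and the inbound address is deployment-specific, so it is shown only as a placeholder:

    [libvirt]
    # replaces: live_migration_uri = qemu+tls://%s/system
    live_migration_scheme = tls
    # live_migration_inbound_addr = <migration-network address of this host>
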
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
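
Condensed into nova.conf form, the notable libvirt values logged above describe a KVM host with q35 guests backed by Ceph RBD, TLS-native live migration, multipath volumes and vTPM support. A reconstruction built only from the logged values (no claim is made about which of them differ from upstream defaults):

    [libvirt]
    virt_type = kvm
    cpu_mode = host-model
    hw_machine_type = x86_64=q35
    images_type = rbd
    images_rbd_pool = vms
    images_rbd_ceph_conf = /etc/ceph/ceph.conf
    rbd_user = openstack
    rbd_secret_uuid = 5ecd3f74-dade-5fc4-92ce-8950ae424258
    volume_use_multipath = True
    live_migration_permit_auto_converge = True
    live_migration_permit_post_copy = True
    live_migration_with_native_tls = True
    rx_queue_size = 512
    tx_queue_size = 512
    swtpm_enabled = True
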
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
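
The neutron group above, rendered as a nova.conf section. This host serves instance metadata through Neutron (service_metadata_proxy = True), so the shared secret has to match the one configured on the Neutron metadata agent; its value is masked as **** in the log and is left masked here:

    [neutron]
    auth_type = password
    region_name = regionOne
    valid_interfaces = internal
    ovs_bridge = br-int
    default_floating_pool = nova
    service_metadata_proxy = True
    metadata_proxy_shared_secret = ****
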
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
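The long run of DEBUG lines above, closed by the row of asterisks (cfg.py:2613), is oslo.config's standard startup dump of every registered option; secrets such as transport_url and oslo_limit.password are masked as ****. A minimal sketch of how a service produces such a dump, assuming only that oslo.config is installed (the two options registered here are illustrative picks from the [oslo_messaging_rabbit] group logged above; real services register options via their libraries):

    import logging
    from oslo_config import cfg

    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    # Register a couple of the options seen in the dump above
    # (normally the consuming libraries register these themselves).
    CONF.register_opts(
        [cfg.BoolOpt('amqp_durable_queues', default=False),
         cfg.BoolOpt('rabbit_quorum_queue', default=False)],
        group='oslo_messaging_rabbit')

    # Emits one "<group>.<option> = <value>" DEBUG line per option.
    # The trailing "log_opt_values .../oslo_config/cfg.py:2609" in the log
    # above comes from the oslo.log debug format suffix, which appends the
    # emitting function name and file:line to each DEBUG record.
    CONF.log_opt_values(LOG, logging.DEBUG)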
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.231 228580 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.260 228580 INFO nova.virt.node [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Determined node identity ff2f17cb-ff1d-4da7-9560-4be741380cb1 from /var/lib/nova/compute_id#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.261 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.262 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.262 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.263 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.278 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb57c47a340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.281 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb57c47a340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.282 228580 INFO nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Connection event '1' reason 'None'#033[00m
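The host.py lines above show nova's startup order: start the native event thread, start the green dispatch thread, open qemu:///system, then register lifecycle and connection callbacks. A rough sketch of the underlying libvirt-python calls, assuming the libvirt bindings are installed and a local libvirtd is reachable; the callback body is illustrative only, not nova's actual handler:

    import libvirt

    # Must be called before libvirt.open() so events can be dispatched;
    # compare "Starting native event thread" above. A dedicated thread then
    # has to loop on libvirt.virEventRunDefaultImpl() to service events.
    libvirt.virEventRegisterDefaultImpl()

    conn = libvirt.open('qemu:///system')

    def lifecycle_cb(conn, dom, event, detail, opaque):
        # Illustrative handler; nova queues these events onto its own
        # dispatch thread rather than handling them inline.
        print(dom.name(), event, detail)

    conn.domainEventRegisterAny(
        None, libvirt.VIR_DOMAIN_EVENT_ID_LIFECYCLE, lifecycle_cb, None)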
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.289 228580 INFO nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host capabilities <capabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <host>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <uuid>9a5f3f62-e1ed-4c63-8d00-a3c5e56bbddc</uuid>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <arch>x86_64</arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <microcode version='16777317'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <signature family='23' model='49' stepping='0'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='x2apic'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='tsc-deadline'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='osxsave'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='hypervisor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='tsc_adjust'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='spec-ctrl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='stibp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='arch-capabilities'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='cmp_legacy'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='topoext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='virt-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='lbrv'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='tsc-scale'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='vmcb-clean'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='pause-filter'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='pfthreshold'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='rdctl-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='skip-l1dfl-vmentry'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='mds-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature name='pschange-mc-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <pages unit='KiB' size='4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <pages unit='KiB' size='2048'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <pages unit='KiB' size='1048576'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <power_management>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <suspend_mem/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </power_management>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <iommu support='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <migration_features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <live/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <uri_transports>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <uri_transport>tcp</uri_transport>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <uri_transport>rdma</uri_transport>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </uri_transports>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </migration_features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <topology>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <cells num='1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <cell id='0'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <memory unit='KiB'>7864312</memory>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <pages unit='KiB' size='4'>1966078</pages>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <pages unit='KiB' size='2048'>0</pages>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <distances>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <sibling id='0' value='10'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          </distances>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          <cpus num='8'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:          </cpus>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        </cell>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </cells>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </topology>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <cache>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </cache>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <secmodel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model>selinux</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <doi>0</doi>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </secmodel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <secmodel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model>dac</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <doi>0</doi>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </secmodel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </host>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <guest>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <os_type>hvm</os_type>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <arch name='i686'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <wordsize>32</wordsize>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <domain type='qemu'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <domain type='kvm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <pae/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <nonpae/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <apic default='on' toggle='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <cpuselection/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <deviceboot/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <externalSnapshot/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </guest>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <guest>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <os_type>hvm</os_type>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <arch name='x86_64'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <wordsize>64</wordsize>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <domain type='qemu'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <domain type='kvm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <apic default='on' toggle='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <cpuselection/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <deviceboot/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <externalSnapshot/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </guest>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 
Dec  6 05:00:37 np0005548916 nova_compute[228576]: </capabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: #033[00m
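The <capabilities> document logged above describes the host as libvirt sees it: an 8-vCPU EPYC-Rome-v4 guest with a single NUMA cell, SELinux and DAC security models, and KVM/QEMU domain support for both i686 and x86_64. A minimal sketch of retrieving the same XML, assuming libvirt-python and a reachable qemu:///system:

    import libvirt

    conn = libvirt.open('qemu:///system')
    # Returns the same <capabilities> XML that nova.virt.libvirt.host
    # logs at startup above.
    print(conn.getCapabilities())
    conn.close()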
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.296 228580 DEBUG nova.virt.libvirt.volume.mount [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.297 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.302 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  6 05:00:37 np0005548916 nova_compute[228576]: <domainCapabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <arch>i686</arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <vcpu max='4096'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <os supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='firmware'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>rom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pflash</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>yes</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='secure'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </loader>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </os>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <memoryBacking supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='sourceType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>anonymous</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>memfd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </memoryBacking>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <disk supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='diskDevice'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>disk</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cdrom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>floppy</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>lun</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>fdc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>sata</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </disk>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <graphics supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vnc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egl-headless</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </graphics>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <video supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='modelType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vga</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cirrus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>none</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>bochs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ramfb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </video>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <hostdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='mode'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>subsystem</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='startupPolicy'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>mandatory</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>requisite</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>optional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='subsysType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pci</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='capsType'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='pciBackend'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </hostdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <rng supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>random</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </rng>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <filesystem supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='driverType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>path</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>handle</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtiofs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </filesystem>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <tpm supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-tis</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-crb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>emulator</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>external</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendVersion'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>2.0</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </tpm>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <redirdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </redirdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <channel supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </channel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <crypto supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </crypto>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <interface supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>passt</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </interface>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <panic supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>isa</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>hyperv</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </panic>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <console supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>null</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dev</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pipe</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>stdio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>udp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tcp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu-vdagent</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </console>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <gic supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <genid supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backup supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <async-teardown supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <ps2 supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <sev supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <sgx supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <hyperv supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='features'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>relaxed</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vapic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>spinlocks</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vpindex</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>runtime</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>synic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>stimer</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>reset</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vendor_id</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>frequencies</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>reenlightenment</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tlbflush</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ipi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>avic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>emsr_bitmap</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>xmm_input</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <defaults>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </defaults>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </hyperv>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <launchSecurity supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='sectype'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tdx</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </launchSecurity>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: </domainCapabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
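[Editor's note on the dump above: the domainCapabilities document that ends here is what nova-compute retrieves from libvirt in _get_domain_capabilities to learn which CPU models, devices, and features this hypervisor can expose to guests. Each <model usable='no'> entry is paired with a <blockers> list naming the host CPU features whose absence makes that model unusable on this host (e.g. the many xsaves/erms blockers above). The same document can be fetched outside Nova. Below is a minimal sketch, assuming the libvirt-python binding and a local qemu:///system socket; the parsing loop is illustrative and is not Nova's actual code.]

    import xml.etree.ElementTree as ET

    import libvirt  # from the libvirt-python package

    conn = libvirt.open('qemu:///system')  # local hypervisor connection

    # Mirror the logged query: emulator path, arch, machine type and virt
    # type correspond to the <path>, <arch>, <machine> and <domain>
    # elements shown in the XML above.
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm',  # emulator binary
        'x86_64',                 # or 'i686', as in the second query below
        'pc',                     # machine type alias
        'kvm',                    # virtualization type
        0,                        # flags (unused)
    )

    root = ET.fromstring(caps_xml)
    # List the named CPU models in custom mode and, for unusable ones,
    # the host features reported as blockers.
    for model in root.findall(".//cpu/mode[@name='custom']/model"):
        blockers = root.find(".//blockers[@model='%s']" % model.text)
        missing = [f.get('name') for f in blockers] if blockers is not None else []
        line = "%s: usable=%s" % (model.text, model.get('usable'))
        if missing:
            line += " (blocked by: %s)" % ', '.join(missing)
        print(line)

    conn.close()

[The equivalent one-off check from a shell is `virsh domcapabilities --virttype kvm --arch i686 --machine pc`, which prints the same XML that appears in the next debug message.]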
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.308 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  6 05:00:37 np0005548916 nova_compute[228576]: <domainCapabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <arch>i686</arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <vcpu max='240'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <os supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='firmware'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>rom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pflash</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>yes</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='secure'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </loader>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </os>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <memoryBacking supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='sourceType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>anonymous</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>memfd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </memoryBacking>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <disk supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='diskDevice'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>disk</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cdrom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>floppy</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>lun</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ide</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>fdc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>sata</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </disk>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <graphics supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vnc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egl-headless</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </graphics>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <video supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='modelType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vga</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cirrus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>none</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>bochs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ramfb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </video>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <hostdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='mode'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>subsystem</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='startupPolicy'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>mandatory</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>requisite</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>optional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='subsysType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pci</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='capsType'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='pciBackend'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </hostdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <rng supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>random</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </rng>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <filesystem supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='driverType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>path</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>handle</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtiofs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </filesystem>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <tpm supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-tis</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-crb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>emulator</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>external</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendVersion'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>2.0</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </tpm>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <redirdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </redirdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <channel supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </channel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <crypto supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </crypto>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <interface supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>passt</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </interface>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <panic supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>isa</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>hyperv</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </panic>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <console supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>null</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dev</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pipe</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>stdio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>udp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tcp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu-vdagent</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </console>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <gic supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <genid supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backup supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <async-teardown supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <ps2 supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <sev supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <sgx supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <hyperv supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='features'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>relaxed</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vapic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>spinlocks</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vpindex</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>runtime</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>synic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>stimer</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>reset</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vendor_id</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>frequencies</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>reenlightenment</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tlbflush</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ipi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>avic</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>emsr_bitmap</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>xmm_input</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <defaults>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </defaults>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </hyperv>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <launchSecurity supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='sectype'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tdx</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </launchSecurity>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: </domainCapabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
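The dump above is the domainCapabilities document nova_compute retrieves from libvirt. As a minimal sketch (an illustration under assumptions, not nova's actual code), the same XML can be fetched and filtered with libvirt-python and the standard library; the connection URI and the usable-model filter are assumptions, while the emulator path, arch, machine type, and virt type come from the log itself:

import libvirt
import xml.etree.ElementTree as ET

# Assumed local system URI; nova connects through its own host abstraction.
conn = libvirt.open('qemu:///system')
caps_xml = conn.getDomainCapabilities(
    '/usr/libexec/qemu-kvm',   # <path> from the dump
    'x86_64',                  # <arch>
    'q35',                     # machine type nova queries below
    'kvm')                     # <domain>

root = ET.fromstring(caps_xml)
# Print CPU models reported usable without blockers (usable='yes' entries).
for model in root.findall("./cpu/mode[@name='custom']/model"):
    if model.get('usable') == 'yes':
        print(model.text)
conn.close()

On this host, models such as Westmere and Westmere-IBRS would print, while the usable='no' entries (the Skylake, Snowridge, and SapphireRapids variants above) are skipped.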
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.337 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
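Nova repeats the capabilities query once per machine type ({'q35', 'pc'} above). For the long <blockers> lists in these dumps, a small hypothetical helper (an assumption for illustration, not part of nova) can map each non-usable model to the host features blocking it:

import xml.etree.ElementTree as ET

def blocked_models(caps_xml: str) -> dict:
    """Map each blocked CPU model name to its list of blocking features."""
    root = ET.fromstring(caps_xml)
    custom = root.find("./cpu/mode[@name='custom']")
    return {
        b.get('model'): [f.get('name') for f in b.findall('feature')]
        for b in custom.findall('blockers')
    }

Per the dump above, blocked_models(caps_xml)['Skylake-Client'] would yield ['erms', 'hle', 'invpcid', 'pcid', 'rtm'] on this EPYC-Rome host.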
Dec  6 05:00:37 np0005548916 nova_compute[228576]: 2025-12-06 10:00:37.342 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  6 05:00:37 np0005548916 nova_compute[228576]: <domainCapabilities>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <arch>x86_64</arch>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <vcpu max='4096'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <os supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='firmware'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>efi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>rom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pflash</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>yes</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='secure'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>yes</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>no</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </loader>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </os>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>on</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>off</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='athlon-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='core2duo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='coreduo-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='n270-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <blockers model='phenom-v1'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </blockers>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </mode>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </cpu>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <memoryBacking supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <enum name='sourceType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>anonymous</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <value>memfd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </memoryBacking>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <disk supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='diskDevice'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>disk</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cdrom</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>floppy</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>lun</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>fdc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>sata</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </disk>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <graphics supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vnc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egl-headless</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </graphics>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <video supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='modelType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vga</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>cirrus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>none</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>bochs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>ramfb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </video>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <hostdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='mode'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>subsystem</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='startupPolicy'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>mandatory</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>requisite</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>optional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='subsysType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pci</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>scsi</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='capsType'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='pciBackend'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </hostdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <rng supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>random</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>egd</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </rng>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <filesystem supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='driverType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>path</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>handle</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>virtiofs</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </filesystem>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <tpm supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-tis</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tpm-crb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>emulator</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>external</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendVersion'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>2.0</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </tpm>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <redirdev supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='bus'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>usb</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </redirdev>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <channel supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </channel>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <crypto supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>builtin</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </crypto>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <interface supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='backendType'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>default</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>passt</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </interface>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <panic supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='model'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>isa</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>hyperv</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </panic>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <console supported='yes'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      <enum name='type'>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>null</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>vc</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pty</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dev</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>file</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>pipe</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>stdio</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>udp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>tcp</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>unix</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>qemu-vdagent</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:        <value>dbus</value>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:      </enum>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    </console>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  </devices>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:  <features>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <gic supported='no'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <genid supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <backup supported='yes'/>
Dec  6 05:00:37 np0005548916 nova_compute[228576]:    <async-teardown supported='yes'/>
Dec  6 05:01:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:07 np0005548916 rsyslogd[1007]: imjournal: 1887 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  6 05:01:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:07.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:07.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:09.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:09.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:11.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:11.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:13.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:13.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:15.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:17.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:19.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:19.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:20 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:21.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:22 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:23.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:24 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:25.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:01:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:01:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:25.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:26 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0001a90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:27 np0005548916 nova_compute[228576]: 2025-12-06 10:01:27.478 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:27 np0005548916 nova_compute[228576]: 2025-12-06 10:01:27.517 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:27 np0005548916 podman[229201]: 2025-12-06 10:01:27.815091541 +0000 UTC m=+0.106267169 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:01:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:27.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:28 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:28 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:29.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:29.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:30 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0001c10 fd 38 proxy ignored for local
Dec  6 05:01:30 np0005548916 kernel: ganesha.nfsd[229115]: segfault at 50 ip 00007fa58ed6532e sp 00007fa5477fd210 error 4 in libntirpc.so.5.8[7fa58ed4a000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  6 05:01:30 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:01:30 np0005548916 systemd[1]: Started Process Core Dump (PID 229255/UID 0).
Dec  6 05:01:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:30 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:31.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:31 np0005548916 systemd-coredump[229256]: Process 224784 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fa58ed6532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:01:31 np0005548916 systemd[1]: systemd-coredump@7-229255-0.service: Deactivated successfully.
Dec  6 05:01:31 np0005548916 systemd[1]: systemd-coredump@7-229255-0.service: Consumed 1.526s CPU time.
Dec  6 05:01:31 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:01:31 np0005548916 podman[229288]: 2025-12-06 10:01:31.724887996 +0000 UTC m=+0.034846217 container died 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  6 05:01:31 np0005548916 systemd[1]: var-lib-containers-storage-overlay-8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036-merged.mount: Deactivated successfully.
Dec  6 05:01:31 np0005548916 podman[229288]: 2025-12-06 10:01:31.76836964 +0000 UTC m=+0.078327861 container remove 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  6 05:01:31 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:01:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:31.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:31 np0005548916 podman[229302]: 2025-12-06 10:01:31.897085233 +0000 UTC m=+0.084644574 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  6 05:01:31 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 05:01:31 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.784s CPU time.
Dec  6 05:01:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:33.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:33.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:35.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100136 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.473 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.542 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.610 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:01:36 np0005548916 nova_compute[228576]: 2025-12-06 10:01:36.612 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:01:36 np0005548916 podman[229347]: 2025-12-06 10:01:36.760518982 +0000 UTC m=+0.058689505 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Dec  6 05:01:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:01:37 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/244775073' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.075 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.268 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.270 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5231MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.270 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.271 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.486 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.487 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:01:37 np0005548916 nova_compute[228576]: 2025-12-06 10:01:37.557 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:01:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:37.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
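[editor note] The beast access lines here and throughout the section record anonymous "HEAD / HTTP/1.0" probes from 192.168.122.100 and 192.168.122.102 roughly every two seconds: load-balancer liveness checks against the RGW frontend. A sketch of an equivalent probe; the host and port are placeholders since the log does not record beast's listening address, and http.client speaks HTTP/1.1 rather than the HTTP/1.0 the probes use:

    # Hypothetical liveness probe against an RGW beast frontend.
    import http.client

    conn = http.client.HTTPConnection("np0005548916", 8080, timeout=2)  # port assumed
    conn.request("HEAD", "/")
    print(conn.getresponse().status)  # 200 while the gateway is up, as logged
    conn.close()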
Dec  6 05:01:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:37.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:01:37 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2859480221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:01:38 np0005548916 nova_compute[228576]: 2025-12-06 10:01:38.006 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:01:38 np0005548916 nova_compute[228576]: 2025-12-06 10:01:38.013 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:01:38 np0005548916 nova_compute[228576]: 2025-12-06 10:01:38.038 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
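[editor note] The inventory dict above is what placement turns into schedulable capacity per resource class, via the standard rule (total - reserved) * allocation_ratio; the rule itself is background knowledge, not something this log states. Worked through with the logged values:

    # Capacity implied by the inventory reported for provider ff2f17cb-....
    inventory = {
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # MEMORY_MB 7167.0, VCPU 32.0, DISK_GB 53.1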
Dec  6 05:01:38 np0005548916 nova_compute[228576]: 2025-12-06 10:01:38.040 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:01:38 np0005548916 nova_compute[228576]: 2025-12-06 10:01:38.040 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
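[editor note] The Acquiring/acquired/released triples bracketing _update_available_resource (held 0.770s) are oslo.concurrency's named-lock pattern; the inner ... lockutils.py:404/409/423 suffixes in these messages come from its lock helper. A minimal sketch of the same pattern, with an illustrative body:

    # Named lock around a critical section, as the resource tracker does above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_available_resource():
        # audit resources and push inventory to placement (illustrative only)
        pass

    update_available_resource()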
Dec  6 05:01:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:39.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:39.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:41.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:41.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:42 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 8.
Dec  6 05:01:42 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:01:42 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.784s CPU time.
Dec  6 05:01:42 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:01:42 np0005548916 podman[229465]: 2025-12-06 10:01:42.415586336 +0000 UTC m=+0.057204039 container create cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:01:42 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:01:42 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:01:42 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:01:42 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:01:42 np0005548916 podman[229465]: 2025-12-06 10:01:42.477728143 +0000 UTC m=+0.119345846 container init cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid)
Dec  6 05:01:42 np0005548916 podman[229465]: 2025-12-06 10:01:42.385588808 +0000 UTC m=+0.027206581 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:01:42 np0005548916 podman[229465]: 2025-12-06 10:01:42.488905274 +0000 UTC m=+0.130522957 container start cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Dec  6 05:01:42 np0005548916 bash[229465]: cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:01:42 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:01:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:01:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:43.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:43.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:45.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:45.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:47.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:47.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:01:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:01:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:49.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:49.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:53.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.273 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:01:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:01:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:01:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:01:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  6 05:01:55 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1363450763' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  6 05:01:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2248000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:55.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:55.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:01:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:57.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:01:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100158 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:01:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:58 np0005548916 podman[229571]: 2025-12-06 10:01:58.850315723 +0000 UTC m=+0.146305030 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  6 05:01:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:59.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:01:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:01:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:59.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:01:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:01.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:02 np0005548916 podman[229599]: 2025-12-06 10:02:02.743093996 +0000 UTC m=+0.053657563 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec  6 05:02:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:03.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:03.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:05.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:07.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:07 np0005548916 podman[229622]: 2025-12-06 10:02:07.751645797 +0000 UTC m=+0.062029196 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:02:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:07.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:09.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:09.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:11.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:11.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:13.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:15.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:15.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:17.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:17.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:19.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:19.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:21.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:23.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:25.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:25.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:27.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:27.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:29.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:29 np0005548916 podman[229680]: 2025-12-06 10:02:29.796524489 +0000 UTC m=+0.103882051 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 05:02:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:29.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:02:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:02:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:31.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:31.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:33.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:33 np0005548916 podman[229814]: 2025-12-06 10:02:33.807135764 +0000 UTC m=+0.102534642 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  6 05:02:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:33.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:35.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:35.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:37.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:37.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.030 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.030 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.050 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.051 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.051 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.065 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.066 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.066 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.102 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.103 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.103 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.104 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:02:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:02:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1539383251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.542 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.750 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.752 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5225MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.752 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.753 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:02:38 np0005548916 podman[229883]: 2025-12-06 10:02:38.767366194 +0000 UTC m=+0.067119369 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:02:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.883 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.884 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:02:38 np0005548916 nova_compute[228576]: 2025-12-06 10:02:38.929 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:02:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:02:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1691930506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:02:39 np0005548916 nova_compute[228576]: 2025-12-06 10:02:39.365 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:02:39 np0005548916 nova_compute[228576]: 2025-12-06 10:02:39.371 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:02:39 np0005548916 nova_compute[228576]: 2025-12-06 10:02:39.389 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:02:39 np0005548916 nova_compute[228576]: 2025-12-06 10:02:39.390 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:02:39 np0005548916 nova_compute[228576]: 2025-12-06 10:02:39.391 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:02:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:39.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:02:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:39.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:02:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:41.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:41.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:43.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:45.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:45.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:47.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:47.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100249 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:02:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:49.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:02:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:49.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:02:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:02:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:51.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:02:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:51.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:53.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:53.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:02:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.275 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:02:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.275 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:02:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:55.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:55.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:57.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:57.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:02:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:02:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:59.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:02:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:02:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:59.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:00 np0005548916 podman[229961]: 2025-12-06 10:03:00.787534522 +0000 UTC m=+0.088815487 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:03:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:03:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:03:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:01.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:01.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:03.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:03:04 np0005548916 podman[229989]: 2025-12-06 10:03:04.771392785 +0000 UTC m=+0.066064862 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:03:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:05.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:06 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.959 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:03:06 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.960 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:03:06 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.961 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:03:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:07.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:09 np0005548916 podman[230011]: 2025-12-06 10:03:09.759175416 +0000 UTC m=+0.064995837 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd)
Dec  6 05:03:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100311 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:03:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:11.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:13.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:14.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:15.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:16.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:17.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:18.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:19.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:20.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:21.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:22.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:23.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:24.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:25.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:26.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:27.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:28.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:29.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:31.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:31 np0005548916 podman[230093]: 2025-12-06 10:03:31.812915022 +0000 UTC m=+0.114226008 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  6 05:03:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:33.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:35.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:35 np0005548916 podman[230123]: 2025-12-06 10:03:35.80768273 +0000 UTC m=+0.105000763 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:03:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:03:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:36 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:03:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:37.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.392 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.393 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.393 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.453 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.453 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:03:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:03:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3965991936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:03:39 np0005548916 nova_compute[228576]: 2025-12-06 10:03:39.912 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:03:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.098 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.099 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5260MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:03:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.157 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.172 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:03:40 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:03:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3811939761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.613 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.620 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.637 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.640 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.640 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.680 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:40 np0005548916 nova_compute[228576]: 2025-12-06 10:03:40.681 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:03:40 np0005548916 podman[230273]: 2025-12-06 10:03:40.778412035 +0000 UTC m=+0.074666733 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec  6 05:03:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:41.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:43.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:44.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:45.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:47.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:48.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:49.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:03:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:50.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:03:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:51.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:53.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.276 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:03:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.277 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:03:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.277 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:03:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:55.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:03:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:03:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:57.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:03:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:59.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:01.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:02.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:02 np0005548916 podman[230355]: 2025-12-06 10:04:02.818181929 +0000 UTC m=+0.123200116 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:04:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:03.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:04.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.796432) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444796541, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2348, "num_deletes": 251, "total_data_size": 6048881, "memory_usage": 6142096, "flush_reason": "Manual Compaction"}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444818261, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 3957290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20740, "largest_seqno": 23083, "table_properties": {"data_size": 3947958, "index_size": 5826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19302, "raw_average_key_size": 20, "raw_value_size": 3929213, "raw_average_value_size": 4084, "num_data_blocks": 257, "num_entries": 962, "num_filter_entries": 962, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015229, "oldest_key_time": 1765015229, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 21991 microseconds, and 9465 cpu microseconds.
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.818422) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 3957290 bytes OK
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.818491) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821674) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821693) EVENT_LOG_v1 {"time_micros": 1765015444821688, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821711) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6038482, prev total WAL file size 6038482, number of live WAL files 2.
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.823819) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3864KB)], [39(13MB)]
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444823906, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17912057, "oldest_snapshot_seqno": -1}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5482 keys, 15736878 bytes, temperature: kUnknown
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444905853, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15736878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15697398, "index_size": 24650, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138128, "raw_average_key_size": 25, "raw_value_size": 15595288, "raw_average_value_size": 2844, "num_data_blocks": 1018, "num_entries": 5482, "num_filter_entries": 5482, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.906192) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15736878 bytes
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.907402) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 218.3 rd, 191.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 13.3 +0.0 blob) out(15.0 +0.0 blob), read-write-amplify(8.5) write-amplify(4.0) OK, records in: 5998, records dropped: 516 output_compression: NoCompression
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.907419) EVENT_LOG_v1 {"time_micros": 1765015444907411, "job": 22, "event": "compaction_finished", "compaction_time_micros": 82039, "compaction_time_cpu_micros": 38881, "output_level": 6, "num_output_files": 1, "total_output_size": 15736878, "num_input_records": 5998, "num_output_records": 5482, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444908249, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444910991, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.823747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:04 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:05.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:06.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:06 np0005548916 podman[230384]: 2025-12-06 10:04:06.781011457 +0000 UTC m=+0.080511895 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 05:04:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:08.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:10.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:11 np0005548916 podman[230429]: 2025-12-06 10:04:11.010774307 +0000 UTC m=+0.055859233 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:04:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:11.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:12.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:13.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:15.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:17.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:18.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:19.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:20.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:21.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:23.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:24.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:26.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:29.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:30.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:31.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:32.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:33.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:33 np0005548916 podman[230488]: 2025-12-06 10:04:33.815360492 +0000 UTC m=+0.117960248 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:04:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:34.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:35.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:04:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:36.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:04:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:36 np0005548916 nova_compute[228576]: 2025-12-06 10:04:36.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:36 np0005548916 nova_compute[228576]: 2025-12-06 10:04:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:04:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.523 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:04:37 np0005548916 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.716397) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477716439, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 532, "num_deletes": 251, "total_data_size": 842012, "memory_usage": 852240, "flush_reason": "Manual Compaction"}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477721668, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 393629, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23088, "largest_seqno": 23615, "table_properties": {"data_size": 391065, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6539, "raw_average_key_size": 19, "raw_value_size": 385955, "raw_average_value_size": 1148, "num_data_blocks": 27, "num_entries": 336, "num_filter_entries": 336, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015445, "oldest_key_time": 1765015445, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 5327 microseconds, and 2267 cpu microseconds.
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.721724) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 393629 bytes OK
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.721750) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723114) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723131) EVENT_LOG_v1 {"time_micros": 1765015477723125, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723169) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 838913, prev total WAL file size 838913, number of live WAL files 2.
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723740) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373532' seq:0, type:0; will stop at (end)
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(384KB)], [42(15MB)]
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477723772, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16130507, "oldest_snapshot_seqno": -1}
Dec  6 05:04:37 np0005548916 podman[230528]: 2025-12-06 10:04:37.752822472 +0000 UTC m=+0.060298222 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5318 keys, 12216452 bytes, temperature: kUnknown
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477988553, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12216452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12182362, "index_size": 19708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 135099, "raw_average_key_size": 25, "raw_value_size": 12087336, "raw_average_value_size": 2272, "num_data_blocks": 802, "num_entries": 5318, "num_filter_entries": 5318, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:37.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.988807) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12216452 bytes
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.992719) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.9 rd, 46.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 15.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(72.0) write-amplify(31.0) OK, records in: 5818, records dropped: 500 output_compression: NoCompression
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.992737) EVENT_LOG_v1 {"time_micros": 1765015477992728, "job": 24, "event": "compaction_finished", "compaction_time_micros": 264869, "compaction_time_cpu_micros": 28898, "output_level": 6, "num_output_files": 1, "total_output_size": 12216452, "num_input_records": 5818, "num_output_records": 5318, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477993071, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477996896, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:38.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:04:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2156903250' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:04:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.214 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.689s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.392 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5253MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.459 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.460 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.477 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:04:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - - [06/Dec/2025:10:04:38.730 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.001000024s
Dec  6 05:04:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:04:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2295503369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.934 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.940 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.961 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.963 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:04:38 np0005548916 nova_compute[228576]: 2025-12-06 10:04:38.963 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:04:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:39 np0005548916 nova_compute[228576]: 2025-12-06 10:04:39.934 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:39 np0005548916 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:39 np0005548916 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:39 np0005548916 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:39.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:40.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:40 np0005548916 nova_compute[228576]: 2025-12-06 10:04:40.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:41 np0005548916 nova_compute[228576]: 2025-12-06 10:04:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:04:41 np0005548916 podman[230582]: 2025-12-06 10:04:41.755249777 +0000 UTC m=+0.061122712 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  6 05:04:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:41.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:42 np0005548916 podman[230725]: 2025-12-06 10:04:42.319827718 +0000 UTC m=+0.066587794 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  6 05:04:42 np0005548916 podman[230725]: 2025-12-06 10:04:42.415171334 +0000 UTC m=+0.161931430 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 05:04:42 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:42 np0005548916 podman[230839]: 2025-12-06 10:04:42.914264631 +0000 UTC m=+0.058245086 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:04:42 np0005548916 podman[230839]: 2025-12-06 10:04:42.927528031 +0000 UTC m=+0.071508476 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:04:43 np0005548916 podman[230928]: 2025-12-06 10:04:43.243389139 +0000 UTC m=+0.052445746 container exec cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:04:43 np0005548916 podman[230928]: 2025-12-06 10:04:43.257707414 +0000 UTC m=+0.066764031 container exec_died cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec  6 05:04:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:43 np0005548916 podman[230993]: 2025-12-06 10:04:43.453530017 +0000 UTC m=+0.047496506 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:04:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Dec  6 05:04:43 np0005548916 podman[230993]: 2025-12-06 10:04:43.500683554 +0000 UTC m=+0.094650033 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:04:43 np0005548916 podman[231060]: 2025-12-06 10:04:43.693564816 +0000 UTC m=+0.049147126 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, architecture=x86_64, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  6 05:04:43 np0005548916 podman[231060]: 2025-12-06 10:04:43.707489262 +0000 UTC m=+0.063071562 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.openshift.expose-services=, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-type=git, name=keepalived, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vendor=Red Hat, Inc., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 05:04:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Dec  6 05:04:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:04:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Dec  6 05:04:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:45.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:46.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:48.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Dec  6 05:04:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:50.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:50 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:50 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Dec  6 05:04:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:54.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.278 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:04:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:04:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:04:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:04:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:56.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:04:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:56.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:04:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:58.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:00.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:02.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004be0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:04 np0005548916 podman[231235]: 2025-12-06 10:05:04.823247206 +0000 UTC m=+0.115002824 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:05:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:06.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:08 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:08.038 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:05:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:08 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:08.039 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:05:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:08.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:08 np0005548916 podman[231265]: 2025-12-06 10:05:08.76734588 +0000 UTC m=+0.071161747 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:05:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:10.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:10.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100510 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:05:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:12.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:05:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:12.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:12 np0005548916 podman[231312]: 2025-12-06 10:05:12.771059992 +0000 UTC m=+0.079605201 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Dec  6 05:05:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:14.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:14.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100515 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:16.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:05:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:05:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:17 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:17.041 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:05:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:18.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
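
For scale, the mon cache-autotune figures in the line above convert as follows (exact byte values from the log; MiB = 2**20):

    # Unit conversion of the _set_new_cache_sizes values logged above.
    MIB = 1 << 20
    for name, val in [("cache_size", 1020054731),
                      ("inc_alloc", 348127232),
                      ("full_alloc", 348127232),
                      ("kv_alloc", 318767104)]:
        print(f"{name}: {val / MIB:.1f} MiB")
    # cache_size: 972.8 MiB; inc/full_alloc: 332.0 MiB; kv_alloc: 304.0 MiB
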
Dec  6 05:05:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:05:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:20.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:22.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:05:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:24.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:26.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:26.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
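
The reaper sequence since 10:05:20 (IN GRACE for 90 s, client info reloaded from the backend, "reclaim complete(0) clid count(0)", then NOT IN GRACE at 10:05:26) shows the grace window being lifted early because no clients hold reclaimable state. A deliberately simplified sketch of that lift decision; the real check in ganesha is more involved, so treat this as an assumption-level reduction:

    # Simplified model of the early-lift decision visible in the reaper lines.
    def can_lift_grace(reclaim_complete: int, clid_count: int) -> bool:
        # Every known client has finished reclaim, or none exist at all.
        return reclaim_complete >= clid_count

    print(can_lift_grace(0, 0))   # True -> "NFS Server Now NOT IN GRACE"
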
Dec  6 05:05:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004220 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:28.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:05:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:30.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:30.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004240 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:32.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:32.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100532 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:05:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:05:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004260 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:34.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:34.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:05:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e152 e152: 3 total, 3 up, 3 in
Dec  6 05:05:35 np0005548916 podman[231371]: 2025-12-06 10:05:35.847972234 +0000 UTC m=+0.141137805 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:05:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:36.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.502 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.503 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.504 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:05:36 np0005548916 nova_compute[228576]: 2025-12-06 10:05:36.518 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e153 e153: 3 total, 3 up, 3 in
Dec  6 05:05:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100537 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:05:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:37 np0005548916 nova_compute[228576]: 2025-12-06 10:05:37.526 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:37 np0005548916 nova_compute[228576]: 2025-12-06 10:05:37.527 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:05:37 np0005548916 nova_compute[228576]: 2025-12-06 10:05:37.527 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:05:37 np0005548916 nova_compute[228576]: 2025-12-06 10:05:37.546 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:05:37 np0005548916 nova_compute[228576]: 2025-12-06 10:05:37.547 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:38.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:38.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:38 np0005548916 nova_compute[228576]: 2025-12-06 10:05:38.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:05:39 np0005548916 podman[231423]: 2025-12-06 10:05:39.757778631 +0000 UTC m=+0.061746820 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 05:05:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:05:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3018802899' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:05:39 np0005548916 nova_compute[228576]: 2025-12-06 10:05:39.994 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
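
The resource audit shells out to exactly the command logged above and parses its JSON output. A sketch doing the same, assuming the stats field names of current ceph df --format=json output (the field layout itself is not shown in this log):

    # Re-runs the probe from the oslo_concurrency lines above and reads the
    # cluster totals; the 'stats' field names are an assumption for this sketch.
    import json
    import subprocess

    out = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"]
    )
    stats = json.loads(out)["stats"]
    print(f"total: {stats['total_bytes'] / 1024**3:.1f} GiB, "
          f"avail: {stats['total_avail_bytes'] / 1024**3:.1f} GiB")
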
Dec  6 05:05:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:40.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.142 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.143 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5250MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.144 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.144 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:05:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:40.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.246 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.247 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:05:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.290 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.373 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.374 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
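
The inventory in the two lines above maps to schedulable capacity via Placement's usual formula, capacity = (total - reserved) * allocation_ratio; the formula is stated here as background, since the log itself only carries the raw inventory. Worked with the logged values:

    # Capacity computation for the logged inventory (assumed Placement formula).
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        cap = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {cap:g}")   # VCPU: 32, MEMORY_MB: 7167, DISK_GB: 53.1
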
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.396 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.426 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.446 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:05:40 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:05:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2320957764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.931 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.937 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.950 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.951 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:05:40 np0005548916 nova_compute[228576]: 2025-12-06 10:05:40.952 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:05:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:41 np0005548916 nova_compute[228576]: 2025-12-06 10:05:41.953 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:42.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:42.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:42 np0005548916 nova_compute[228576]: 2025-12-06 10:05:42.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 e154: 3 total, 3 up, 3 in
Dec  6 05:05:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:43 np0005548916 nova_compute[228576]: 2025-12-06 10:05:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:43 np0005548916 podman[231468]: 2025-12-06 10:05:43.757057967 +0000 UTC m=+0.062444177 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:05:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:44.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:44.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:46.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:46.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:48.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:48.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:50.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:50.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:52.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:52.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100552 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.795525) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552795571, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1149, "num_deletes": 256, "total_data_size": 2543773, "memory_usage": 2579736, "flush_reason": "Manual Compaction"}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552807890, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1680401, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23620, "largest_seqno": 24764, "table_properties": {"data_size": 1675158, "index_size": 2703, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11296, "raw_average_key_size": 19, "raw_value_size": 1664427, "raw_average_value_size": 2864, "num_data_blocks": 118, "num_entries": 581, "num_filter_entries": 581, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015477, "oldest_key_time": 1765015477, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 12428 microseconds, and 6182 cpu microseconds.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.807953) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1680401 bytes OK
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.807980) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809488) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809501) EVENT_LOG_v1 {"time_micros": 1765015552809497, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809522) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2538114, prev total WAL file size 2538114, number of live WAL files 2.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.810335) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1641KB)], [45(11MB)]
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552810427, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13896853, "oldest_snapshot_seqno": -1}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5365 keys, 13714802 bytes, temperature: kUnknown
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552872926, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13714802, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13678587, "index_size": 21705, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137354, "raw_average_key_size": 25, "raw_value_size": 13580964, "raw_average_value_size": 2531, "num_data_blocks": 884, "num_entries": 5365, "num_filter_entries": 5365, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.873279) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13714802 bytes
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.874844) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.1 rd, 219.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 11.7 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(16.4) write-amplify(8.2) OK, records in: 5899, records dropped: 534 output_compression: NoCompression
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.874863) EVENT_LOG_v1 {"time_micros": 1765015552874855, "job": 26, "event": "compaction_finished", "compaction_time_micros": 62581, "compaction_time_cpu_micros": 26986, "output_level": 6, "num_output_files": 1, "total_output_size": 13714802, "num_input_records": 5899, "num_output_records": 5365, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552875318, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552877651, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.810273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877754) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877758) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:54.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:54.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:05:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:05:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:05:54 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:05:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:55 np0005548916 ceph-mon[79770]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:05:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:56.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:05:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:56.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:05:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:58.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:05:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:58.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:00.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:00.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:06:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:02.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:02.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:06:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:06:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:04.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:04.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:06.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:06.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:06 np0005548916 podman[231634]: 2025-12-06 10:06:06.788121574 +0000 UTC m=+0.085263728 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:06:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:06:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:08 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.105 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:06:08 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.107 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:06:08 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.108 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:06:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:08.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:08.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:10.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:10.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:10 np0005548916 podman[231665]: 2025-12-06 10:06:10.737117247 +0000 UTC m=+0.047280152 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:06:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:12.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100612 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:06:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:12.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:14.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:14.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:14 np0005548916 podman[231712]: 2025-12-06 10:06:14.779113722 +0000 UTC m=+0.076885866 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  6 05:06:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:16.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:16.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:18.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:18.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:20.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:20.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:22.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:22.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:24.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:24.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:26.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:26.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:28.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:28.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:30.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:30.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:32.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:32.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:34.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:34.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:36.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:36.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:36 np0005548916 nova_compute[228576]: 2025-12-06 10:06:36.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:36 np0005548916 nova_compute[228576]: 2025-12-06 10:06:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:06:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:37 np0005548916 podman[231768]: 2025-12-06 10:06:37.779296653 +0000 UTC m=+0.088654289 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:06:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:38.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:38.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.510 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.511 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.511 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.539 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.541 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.542 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.543 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.545 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:06:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/172757125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:39 np0005548916 nova_compute[228576]: 2025-12-06 10:06:39.984 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:06:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:06:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:40.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.195 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5203MB free_disk=59.897621154785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:06:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.270 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.271 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:06:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.297 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:06:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:40.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:40 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1754887358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.755 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.760 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.773 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.775 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:06:40 np0005548916 nova_compute[228576]: 2025-12-06 10:06:40.775 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:06:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:41 np0005548916 nova_compute[228576]: 2025-12-06 10:06:41.734 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:41 np0005548916 nova_compute[228576]: 2025-12-06 10:06:41.753 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:41 np0005548916 nova_compute[228576]: 2025-12-06 10:06:41.753 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:41 np0005548916 nova_compute[228576]: 2025-12-06 10:06:41.754 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:41 np0005548916 podman[231841]: 2025-12-06 10:06:41.755398169 +0000 UTC m=+0.056313200 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  6 05:06:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:42.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:42.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:42 np0005548916 nova_compute[228576]: 2025-12-06 10:06:42.483 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:43 np0005548916 nova_compute[228576]: 2025-12-06 10:06:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:06:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:44.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:44.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:45 np0005548916 podman[231862]: 2025-12-06 10:06:45.763115678 +0000 UTC m=+0.068081453 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  6 05:06:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:46.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:46.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:48.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:48.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:50.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:50.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:06:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:52.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:52.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:06:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:06:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:54.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:06:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:06:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:06:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:54.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100655 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:06:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:56.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:56.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:06:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:58.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:06:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:06:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:58.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:00.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:00.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:07:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:07:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:02.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:02.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004770 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:07:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
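The reaper thread above lifts the grace period once reclaim is complete with zero clients (clid count(0)) and immediately starts a fresh 90-second grace window; in a cephadm NFS ingress setup that restart typically follows a change in cluster membership (an assumption, the trigger is not in this log). A tiny state tracker keyed off the two EVENT strings, checking the NOT IN GRACE form first:

def grace_state(lines):
    # yield the grace state implied by each ganesha EVENT line
    state = None
    for line in lines:
        if 'Now NOT IN GRACE' in line:
            state = 'NOT_IN_GRACE'
        elif 'Now IN GRACE' in line:
            state = 'IN_GRACE'
        yield state

log = [
    'nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE',
    'nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90',
]
print(list(grace_state(log)))  # ['NOT_IN_GRACE', 'IN_GRACE']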
Dec  6 05:07:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
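The monitor's _set_new_cache_sizes line repeats every ~5 s throughout this section, reporting how its memory target is split across the incremental-map, full-map, and key-value caches. A sketch that pulls out the four byte counts and reports them in MiB, assuming this exact field order (note cache_size has no space after the colon, the others do):

import re

CACHE_RE = re.compile(
    r'_set_new_cache_sizes cache_size:(\d+) inc_alloc: (\d+) '
    r'full_alloc: (\d+) kv_alloc: (\d+)')

def cache_sizes_mib(line):
    m = CACHE_RE.search(line)
    if not m:
        return None
    names = ('cache_size', 'inc_alloc', 'full_alloc', 'kv_alloc')
    return {n: int(v) / 2**20 for n, v in zip(names, m.groups())}

line = ('mon.compute-1@2(peon).osd e154 _set_new_cache_sizes '
        'cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 '
        'kv_alloc: 318767104')
print(cache_sizes_mib(line))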
Dec  6 05:07:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:04.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:04.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005400 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:06.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004790 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:06.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:08.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005420 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:08.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:08 np0005548916 podman[232027]: 2025-12-06 10:07:08.777964173 +0000 UTC m=+0.082493851 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
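Each podman health_status event above logs the container name, verdict, failing streak, and the container's full config_data labels on a single line. A sketch pulling out just the name and verdict, assuming name= precedes health_status= as in these events (label order is not guaranteed to be stable across podman versions):

import re

# \b keeps this from matching the "name=" tail of "container_name=" or
# "org.label-schema.name=" labels that appear later in the line.
HEALTH_RE = re.compile(
    r'container health_status \S+ \(.*?\bname=([^,]+),.*?health_status=([^,]+),')

def health(line):
    m = HEALTH_RE.search(line)
    return m.groups() if m else None

sample = ('container health_status 00ce4a06fb81 (image=quay.io/x, '
          'name=ovn_controller, health_status=healthy, health_failing_streak=0,)')
print(health(sample))  # ('ovn_controller', 'healthy')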
Dec  6 05:07:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:07:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:10.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100710 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
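The haproxy Layer4 check above, taken together with the TIRPC "proxy header rest len failed" events arriving on fd 48 roughly every two seconds, is consistent with the ingress haproxy probing ganesha's proxy-protocol listener with bare TCP connects that carry no PROXY header, so each probe connection is marked dead (a plausible reading of the cadence, not something either daemon states). A sketch counting those events per ganesha worker thread:

import re
from collections import Counter

TIRPC_RE = re.compile(r'ganesha\.nfsd-2\[(svc_\d+)\].*proxy header rest len failed')

def probe_counts(lines):
    # count failed proxy-header reads per worker thread (svc_NN)
    return Counter(m.group(1) for line in lines
                   if (m := TIRPC_RE.search(line)) is not None)

sample = ('ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: '
          '0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % '
          '(will set dead)')
print(probe_counts([sample, sample]))  # Counter({'svc_19': 2})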
Dec  6 05:07:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:10.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:12.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:12.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:12 np0005548916 podman[232081]: 2025-12-06 10:07:12.769403058 +0000 UTC m=+0.065640004 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:07:12 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.941 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:07:12 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.942 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:07:12 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.943 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
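The three ovn_metadata_agent lines above show the ack loop: an SB_Global update bumps nb_cfg from 4 to 5, and the agent acknowledges by writing 'neutron:ovn-metadata-sb-cfg': '5' into its Chassis_Private external_ids through an ovsdbapp DbSetCommand. A sketch extracting the old/new sequence numbers from the event line, assuming this DEBUG format:

import re

NB_CFG_RE = re.compile(
    r'row=SB_Global\(.*?nb_cfg=(\d+).*?old=SB_Global\(nb_cfg=(\d+)\)')

def nb_cfg_transition(line):
    # returns (old, new) nb_cfg from an SbGlobalUpdateEvent DEBUG line
    m = NB_CFG_RE.search(line)
    return (int(m.group(2)), int(m.group(1))) if m else None

sample = ('row=SB_Global(external_ids={}, nb_cfg=5, options={}, ipsec=False) '
          'old=SB_Global(nb_cfg=4)')
print(nb_cfg_transition(sample))  # (4, 5)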
Dec  6 05:07:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:14.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100715 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:07:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005480 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:16.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:16 np0005548916 podman[232104]: 2025-12-06 10:07:16.750095996 +0000 UTC m=+0.059443734 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:07:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:18.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:18.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:20.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:20.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:22.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:22.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.912094) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643912321, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1207, "num_deletes": 251, "total_data_size": 2821937, "memory_usage": 2847248, "flush_reason": "Manual Compaction"}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643925331, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1820143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24769, "largest_seqno": 25971, "table_properties": {"data_size": 1815003, "index_size": 2600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11653, "raw_average_key_size": 19, "raw_value_size": 1804422, "raw_average_value_size": 3084, "num_data_blocks": 116, "num_entries": 585, "num_filter_entries": 585, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015553, "oldest_key_time": 1765015553, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 13293 microseconds, and 5530 cpu microseconds.
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.925405) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1820143 bytes OK
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.925438) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927446) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927487) EVENT_LOG_v1 {"time_micros": 1765015643927477, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927512) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2816081, prev total WAL file size 2816081, number of live WAL files 2.
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.928507) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1777KB)], [48(13MB)]
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643928678, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15534945, "oldest_snapshot_seqno": -1}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5433 keys, 13334205 bytes, temperature: kUnknown
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643991238, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13334205, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13297953, "index_size": 21550, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139443, "raw_average_key_size": 25, "raw_value_size": 13199507, "raw_average_value_size": 2429, "num_data_blocks": 875, "num_entries": 5433, "num_filter_entries": 5433, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.991698) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13334205 bytes
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.993174) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 247.7 rd, 212.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.3) OK, records in: 5950, records dropped: 517 output_compression: NoCompression
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.993217) EVENT_LOG_v1 {"time_micros": 1765015643993197, "job": 28, "event": "compaction_finished", "compaction_time_micros": 62705, "compaction_time_cpu_micros": 28249, "output_level": 6, "num_output_files": 1, "total_output_size": 13334205, "num_input_records": 5950, "num_output_records": 5433, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643994217, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:23 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643999696, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.928292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
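The rocksdb run above is one manual flush-plus-compaction on the monitor store: JOB 27 flushes the memtable to L0 table #50 (~1.8 MB), then JOB 28 merges it with L6 table #48 into #51 and deletes both inputs. The compaction key range '7061786F730031373537'..'7061786F730032303039' hex-decodes to paxos\x001757..paxos\x002009, i.e. a range of paxos versions being trimmed. The EVENT_LOG_v1 lines embed a machine-readable JSON object after the prefix; a sketch that extracts it:

import json, re

EVENT_RE = re.compile(r'EVENT_LOG_v1 (\{.*\})')

def rocksdb_events(lines):
    # yield the structured JSON payload of each EVENT_LOG_v1 line
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            yield json.loads(m.group(1))

# the manual-compaction key range is hex-encoded monitor keys:
print(bytes.fromhex('7061786F730031373537'))  # b'paxos\x001757'

sample = ('rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643912321, '
          '"job": 27, "event": "flush_started", "num_entries": 1207}')
print(next(rocksdb_events([sample]))["event"])  # flush_started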
Dec  6 05:07:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:24.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:24.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:26.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005500 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:26.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:28.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:28.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:30.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004890 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:30.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:32.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005540 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180048b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:34.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:36.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:36.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:36 np0005548916 nova_compute[228576]: 2025-12-06 10:07:36.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:07:36 np0005548916 nova_compute[228576]: 2025-12-06 10:07:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
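The two nova-compute lines above show a periodic task firing and then short-circuiting: _reclaim_queued_deletes only does work when CONF.reclaim_instance_interval is positive, and it defaults to 0, so soft-deleted instances are never reclaimed by this task. A standalone sketch of that gating logic (the interval value here is illustrative, not read from any real config):

reclaim_instance_interval = 0  # nova's default; set >0 to enable reclaim

def reclaim_queued_deletes():
    if reclaim_instance_interval <= 0:
        print("CONF.reclaim_instance_interval <= 0, skipping...")
        return
    print("reclaiming instances soft-deleted more than "
          f"{reclaim_instance_interval}s ago")

reclaim_queued_deletes()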
Dec  6 05:07:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:38.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:38.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:39 np0005548916 podman[232161]: 2025-12-06 10:07:39.81174009 +0000 UTC m=+0.101618532 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:07:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:40.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:40 np0005548916 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:40 np0005548916 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:07:40 np0005548916 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:07:40 np0005548916 nova_compute[228576]: 2025-12-06 10:07:40.487 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:07:40 np0005548916 nova_compute[228576]: 2025-12-06 10:07:40.488 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:07:41 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:07:41 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/656854972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:07:41 np0005548916 nova_compute[228576]: 2025-12-06 10:07:41.935 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.092 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.094 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5198MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.094 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.095 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.159 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.160 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.175 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:07:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:42.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:07:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4086081083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.663 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.670 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.697 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.699 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:07:42 np0005548916 nova_compute[228576]: 2025-12-06 10:07:42.699 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:43 np0005548916 nova_compute[228576]: 2025-12-06 10:07:43.700 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548916 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548916 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548916 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548916 podman[232235]: 2025-12-06 10:07:43.758190381 +0000 UTC m=+0.061535015 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:07:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:44.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:44.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:46.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:46.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:47 np0005548916 podman[232258]: 2025-12-06 10:07:47.746406432 +0000 UTC m=+0.055655664 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  6 05:07:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:48.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:50.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:50.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:52.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:52.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:54.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:54.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:56.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:07:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:56.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:07:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005680 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:07:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:58.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:02.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:02.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:04.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:04.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:06.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:08 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:08:08 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:08 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:08 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:08:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:08.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:10.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:10.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:10 np0005548916 podman[232399]: 2025-12-06 10:08:10.811751465 +0000 UTC m=+0.106699545 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  6 05:08:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c540 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:12.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:12.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:14.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:14.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:14 np0005548916 podman[232452]: 2025-12-06 10:08:14.773604146 +0000 UTC m=+0.080483622 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:08:14 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:14.915 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:08:14 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:14.916 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:08:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:16.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:16.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:18.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:18 np0005548916 podman[232499]: 2025-12-06 10:08:18.760054283 +0000 UTC m=+0.062608721 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:08:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005790 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:20.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:20.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:22.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:22 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:22.919 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:08:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:24.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:24.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:26.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:26.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:28.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:28.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c600 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:30.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:30.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c620 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:32.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:32.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:34.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:36.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:36.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:37 np0005548916 nova_compute[228576]: 2025-12-06 10:08:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:37 np0005548916 nova_compute[228576]: 2025-12-06 10:08:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:08:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:38.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:38.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:40.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:40 np0005548916 nova_compute[228576]: 2025-12-06 10:08:40.465 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:40 np0005548916 nova_compute[228576]: 2025-12-06 10:08:40.482 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005920 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:41 np0005548916 podman[232559]: 2025-12-06 10:08:41.783126831 +0000 UTC m=+0.085820455 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 05:08:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c680 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:42.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.488 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.488 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.516 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.518 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:08:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:08:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3506121761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:08:42 np0005548916 nova_compute[228576]: 2025-12-06 10:08:42.984 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.146 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5209MB free_disk=59.96752166748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.213 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.213 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.244 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:08:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:08:43 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1405804968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.690 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.695 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.712 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.713 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:08:43 np0005548916 nova_compute[228576]: 2025-12-06 10:08:43.714 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:08:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005940 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:44.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100844 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:08:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:44.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c6a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:45 np0005548916 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:45 np0005548916 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:45 np0005548916 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:45 np0005548916 nova_compute[228576]: 2025-12-06 10:08:45.696 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:08:45 np0005548916 podman[232631]: 2025-12-06 10:08:45.766783382 +0000 UTC m=+0.074121706 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec  6 05:08:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005960 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:46.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:46.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:48.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:48.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:49 np0005548916 podman[232653]: 2025-12-06 10:08:49.784759144 +0000 UTC m=+0.084801200 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:08:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c6e0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:50.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:50.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:52.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:52.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:08:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:08:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:08:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:08:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:54.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:54.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:08:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:56.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:08:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:56.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:08:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:08:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:58.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:08:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:08:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:58.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:08:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:09:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:00.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:00.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:02.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:09:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:02.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:09:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:04.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:04.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:06.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100906 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:09:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:06.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:08.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:08.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:10.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:12.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:12 np0005548916 podman[232736]: 2025-12-06 10:09:12.445201289 +0000 UTC m=+0.088915005 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  6 05:09:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:12.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:14.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:14.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:15 np0005548916 podman[232833]: 2025-12-06 10:09:15.932101928 +0000 UTC m=+0.060518114 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 05:09:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:16.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:16.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:17 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:09:17 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:17 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:17 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:09:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:17 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:17.798 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:09:17 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:17.801 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:09:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:18.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:18.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:20.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:20.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:20 np0005548916 podman[232941]: 2025-12-06 10:09:20.746374651 +0000 UTC m=+0.057800537 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Dec  6 05:09:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:22.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:09:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:22.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:09:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:24.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:24.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:25 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:25.803 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:09:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:26.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:26.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:28.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:28.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:30.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:30.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:32.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:34.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:34.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:09:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s
Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:09:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:36.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:36.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:37 np0005548916 nova_compute[228576]: 2025-12-06 10:09:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:37 np0005548916 nova_compute[228576]: 2025-12-06 10:09:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:09:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:38.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:38.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100940 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:09:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:40.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:40.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:42.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:42 np0005548916 nova_compute[228576]: 2025-12-06 10:09:42.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:42.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:42 np0005548916 podman[233026]: 2025-12-06 10:09:42.777787817 +0000 UTC m=+0.084190518 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
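The podman record above is a healthcheck event: the container's config_data carries 'healthcheck': {'test': '/openstack/healthcheck'}, podman's timer ran that script inside ovn_controller, and it reported health_status=healthy with health_failing_streak=0. The same check can be triggered on demand; a sketch, assuming podman on PATH:

    import subprocess

    def container_healthy(name: str) -> bool:
        # `podman healthcheck run NAME` executes the container's configured
        # test command (here the mounted /openstack/healthcheck script) and
        # exits 0 on healthy, just like the periodic timer behind this event
        return subprocess.run(["podman", "healthcheck", "run", name]).returncode == 0

    print(container_healthy("ovn_controller"))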
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:09:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:43 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3709469109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:43 np0005548916 nova_compute[228576]: 2025-12-06 10:09:43.989 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
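Lines 10:09:43.496 through 10:09:43.989 above are one complete round trip: nova's resource tracker shells out to `ceph df` via oslo_concurrency.processutils, the mon audit log records the df command arriving from client.openstack, and the command returns 0 in 0.493s. A sketch of the same call and the JSON it is parsed for (`ceph df --format=json` reports a "pools" list with per-pool "stats"; the pool name below is an illustrative assumption):

    import json
    import subprocess

    CMD = ["ceph", "df", "--format=json", "--id", "openstack",
           "--conf", "/etc/ceph/ceph.conf"]

    def pool_stats(pool: str):
        raw = subprocess.run(CMD, capture_output=True, text=True,
                             check=True).stdout
        for p in json.loads(raw).get("pools", []):
            if p.get("name") == pool:
                return p.get("stats")   # e.g. bytes_used, max_avail
        return None

    print(pool_stats("vms"))   # hypothetical pool name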
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.152 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5211MB free_disk=59.89716339111328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.256 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.256 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.273 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:09:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:09:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:44.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:09:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:44.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:44 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3647347951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.726 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.734 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.766 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.769 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:09:44 np0005548916 nova_compute[228576]: 2025-12-06 10:09:44.769 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
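The inventory dict logged at 10:09:44.766 is what placement schedules against; effective capacity per resource class is (total - reserved) * allocation_ratio. Worked out from the values above:

    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        # placement's capacity formula for each resource class
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1

So this otherwise idle 8-vCPU/7.5 GB node advertises 32 schedulable vCPUs (4x overcommit), 7167 MB of RAM, and ~53 GB of disk.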
Dec  6 05:09:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.769 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.770 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.770 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:45 np0005548916 nova_compute[228576]: 2025-12-06 10:09:45.785 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
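The burst of "Running periodic task ComputeManager._*" lines above comes from one timer tick: oslo.service walks every task registered on the manager and invokes each one that is due, logging the name first (which is why idle tasks such as _poll_rebooting_instances still appear). A minimal sketch using oslo.service's documented decorator API (the task name and spacing below are illustrative, not nova's):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class DemoManager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            pass   # each due task is logged, then called, by run_periodic_tasks()

    mgr = DemoManager(cfg.CONF)
    mgr.run_periodic_tasks(context=None)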
Dec  6 05:09:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:46.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:46.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:46 np0005548916 podman[233097]: 2025-12-06 10:09:46.751304543 +0000 UTC m=+0.058584217 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 05:09:47 np0005548916 nova_compute[228576]: 2025-12-06 10:09:47.479 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:09:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:48.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:48.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:09:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:50.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:50.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:51 np0005548916 podman[233118]: 2025-12-06 10:09:51.752875277 +0000 UTC m=+0.059160291 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec  6 05:09:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:09:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:09:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:52.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:52.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.283 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:09:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.283 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:09:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.284 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
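The acquire/acquired/released triple above is oslo.concurrency's standard lock tracing: "waited" is the time spent blocked on the lock, "held" is the runtime of the critical section (0.000s here, since no child process needed attention). The same lifecycle, sketched with the library's public decorator (the lock name matches the log; the body is illustrative):

    from oslo_concurrency import lockutils

    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        # "held 0.000s" in the log is the runtime of this body
        pass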
Dec  6 05:09:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:54.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:54.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
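The grace cycle above completes early: ganesha entered grace at 10:09:49 announcing a 90-second window, reloaded client recovery info from the backend at 10:09:52, and with reclaim complete and a client-id count of 0 lifted grace at 10:09:55, well before the announced maximum. A small sketch that recovers those transitions from raw syslog lines like this file's:

    import re

    GRACE_RE = re.compile(
        r"NFS Server Now (?:(IN GRACE), duration (\d+)|(NOT IN GRACE))")

    def grace_transitions(lines):
        # yields ("enter", max_seconds) / ("exit", None) events
        for line in lines:
            m = GRACE_RE.search(line)
            if m and m.group(1):
                yield ("enter", int(m.group(2)))
            elif m:
                yield ("exit", None)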
Dec  6 05:09:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:56.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:56.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:58.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:09:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:58.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003eb0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101000 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:10:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:00.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:00.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:00 np0005548916 ceph-mon[79770]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Dec  6 05:10:00 np0005548916 ceph-mon[79770]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec  6 05:10:00 np0005548916 ceph-mon[79770]:    daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0 is in unknown state
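The three mon lines above are the periodic health summary: HEALTH_WARN because cephadm lost track of the ganesha daemon on compute-0, consistent with the haproxy DOWN/UP cycle around it. A sketch of fetching the same data programmatically, assuming the JSON shape `ceph health detail -f json` returns (a "checks" map keyed by code such as CEPHADM_FAILED_DAEMON) and keyring access on the host:

    import json
    import subprocess

    def health_checks():
        raw = subprocess.run(["ceph", "health", "detail", "-f", "json"],
                             capture_output=True, text=True, check=True).stdout
        return json.loads(raw).get("checks", {})

    for code, body in health_checks().items():
        print(code, body.get("summary", {}).get("message"))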
Dec  6 05:10:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:02.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:02.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:04.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:04.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:06.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:06.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:08.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:08.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:10.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:10.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:12.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:10:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.0 total, 600.0 interval
Cumulative writes: 5106 writes, 27K keys, 5106 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
Cumulative WAL: 5106 writes, 5106 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1498 writes, 7288 keys, 1498 commit groups, 1.0 writes per commit group, ingest: 16.90 MB, 0.03 MB/s
Interval WAL: 1499 writes, 1499 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    115.3      0.33              0.15        14    0.024       0      0       0.0       0.0
  L6      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4    123.3    107.1      1.58              0.55        13    0.122     67K   6725       0.0       0.0
 Sum      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.4    101.9    108.5      1.92              0.70        27    0.071     67K   6725       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9    122.3    121.3      0.62              0.19        10    0.062     29K   2587       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    123.3    107.1      1.58              0.55        13    0.122     67K   6725       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    115.9      0.33              0.15        13    0.025       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1800.0 total, 600.0 interval
Flush(GB): cumulative 0.037, interval 0.009
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.20 GB write, 0.12 MB/s write, 0.19 GB read, 0.11 MB/s read, 1.9 seconds
Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 0.6 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 13.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000157 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(720,13.00 MB,4.27649%) FilterBlock(27,201.92 KB,0.0648649%) IndexBlock(27,355.77 KB,0.114285%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
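The throughput figures in the dump above are internally consistent: the reported MB/s is simply ingest divided by uptime. Checking one of them:

    # cumulative ingest 0.07 GB over 1800 s of uptime, as reported above
    ingest_gb, uptime_s = 0.07, 1800.0
    print(round(ingest_gb * 1024 / uptime_s, 2))   # 0.04 MB/s, matching the dump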
Dec  6 05:10:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:12.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:13 np0005548916 podman[233202]: 2025-12-06 10:10:13.790397615 +0000 UTC m=+0.100977973 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:10:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:14.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:16.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:16.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:17 np0005548916 podman[233230]: 2025-12-06 10:10:17.750352657 +0000 UTC m=+0.060728760 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:10:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:18.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:20 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:20.402 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:10:20 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:20.403 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:10:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:10:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:20.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:10:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:20.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:21 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:21.405 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:10:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:22.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:22.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:22 np0005548916 podman[233252]: 2025-12-06 10:10:22.768741386 +0000 UTC m=+0.069533917 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Dec  6 05:10:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:23 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.30445373 +0000 UTC m=+0.050048646 container create 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 05:10:24 np0005548916 systemd[1]: Started libpod-conmon-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope.
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.281607246 +0000 UTC m=+0.027202172 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:10:24 np0005548916 systemd[1]: Started libcrun container.
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.414048734 +0000 UTC m=+0.159643690 container init 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:10:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.425401284 +0000 UTC m=+0.170996220 container start 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.429467345 +0000 UTC m=+0.175062251 container attach 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:10:24 np0005548916 zealous_brown[233460]: 167 167
Dec  6 05:10:24 np0005548916 systemd[1]: libpod-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope: Deactivated successfully.
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.434587871 +0000 UTC m=+0.180182757 container died 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:10:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:24 np0005548916 systemd[1]: var-lib-containers-storage-overlay-69fc32efcb6bbf208d174d783e70de0b8b71e32cb94b90766027937fb6de5c49-merged.mount: Deactivated successfully.
Dec  6 05:10:24 np0005548916 podman[233443]: 2025-12-06 10:10:24.477069129 +0000 UTC m=+0.222664025 container remove 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 05:10:24 np0005548916 systemd[1]: libpod-conmon-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope: Deactivated successfully.
Dec  6 05:10:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:24.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:24 np0005548916 podman[233484]: 2025-12-06 10:10:24.644951932 +0000 UTC m=+0.049219746 container create 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  6 05:10:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:24.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:24 np0005548916 systemd[1]: Started libpod-conmon-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope.
Dec  6 05:10:24 np0005548916 podman[233484]: 2025-12-06 10:10:24.62097504 +0000 UTC m=+0.025242864 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:10:24 np0005548916 systemd[1]: Started libcrun container.
Dec  6 05:10:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 05:10:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 05:10:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:10:24 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 05:10:24 np0005548916 podman[233484]: 2025-12-06 10:10:24.757112749 +0000 UTC m=+0.161380573 container init 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:10:24 np0005548916 podman[233484]: 2025-12-06 10:10:24.766502091 +0000 UTC m=+0.170769905 container start 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  6 05:10:24 np0005548916 podman[233484]: 2025-12-06 10:10:24.770182442 +0000 UTC m=+0.174450276 container attach 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]: [
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:    {
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "available": false,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "being_replaced": false,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "ceph_device_lvm": false,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "lsm_data": {},
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "lvs": [],
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "path": "/dev/sr0",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "rejected_reasons": [
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "Insufficient space (<5GB)",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "Has a FileSystem"
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        ],
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        "sys_api": {
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "actuators": null,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "device_nodes": [
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:                "sr0"
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            ],
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "devname": "sr0",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "human_readable_size": "482.00 KB",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "id_bus": "ata",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "model": "QEMU DVD-ROM",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "nr_requests": "2",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "parent": "/dev/sr0",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "partitions": {},
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "path": "/dev/sr0",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "removable": "1",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "rev": "2.5+",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "ro": "0",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "rotational": "1",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "sas_address": "",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "sas_device_handle": "",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "scheduler_mode": "mq-deadline",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "sectors": 0,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "sectorsize": "2048",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "size": 493568.0,
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "support_discard": "2048",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "type": "disk",
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:            "vendor": "QEMU"
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:        }
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]:    }
Dec  6 05:10:25 np0005548916 laughing_clarke[233500]: ]
Dec  6 05:10:25 np0005548916 systemd[1]: libpod-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope: Deactivated successfully.
Dec  6 05:10:25 np0005548916 podman[233484]: 2025-12-06 10:10:25.551391789 +0000 UTC m=+0.955659593 container died 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default)
Dec  6 05:10:25 np0005548916 systemd[1]: var-lib-containers-storage-overlay-5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6-merged.mount: Deactivated successfully.
Dec  6 05:10:25 np0005548916 podman[233484]: 2025-12-06 10:10:25.598788508 +0000 UTC m=+1.003056352 container remove 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:10:25 np0005548916 systemd[1]: libpod-conmon-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope: Deactivated successfully.
Dec  6 05:10:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:10:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:26.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:10:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:26.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:10:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:28.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:28.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:30.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:30.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:32.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:32.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:34.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:36.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:38 np0005548916 nova_compute[228576]: 2025-12-06 10:10:38.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:38 np0005548916 nova_compute[228576]: 2025-12-06 10:10:38.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:10:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:38.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:38.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:10:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:10:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:41 np0005548916 nova_compute[228576]: 2025-12-06 10:10:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:41 np0005548916 nova_compute[228576]: 2025-12-06 10:10:41.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  6 05:10:41 np0005548916 nova_compute[228576]: 2025-12-06 10:10:41.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  6 05:10:41 np0005548916 nova_compute[228576]: 2025-12-06 10:10:41.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:42 np0005548916 nova_compute[228576]: 2025-12-06 10:10:42.499 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:42.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:42.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:43 np0005548916 nova_compute[228576]: 2025-12-06 10:10:43.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:43 np0005548916 nova_compute[228576]: 2025-12-06 10:10:43.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  6 05:10:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.501 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.552 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:10:44 np0005548916 nova_compute[228576]: 2025-12-06 10:10:44.552 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:44.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:44.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
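Each radosgw health probe above ends in a beast access line with a fixed field order: request pointer, client address, user, timestamp, request line, HTTP status, bytes sent, and latency. A small parser for that layout (the regex is inferred from the samples in this log, not taken from radosgw documentation):

import re

BEAST = re.compile(
    r'beast: (?P<req>0x[0-9a-f]+): (?P<addr>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'- - - latency=(?P<latency>[\d.]+)s'
)

line = ('beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous '
        '[06/Dec/2025:10:10:44.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.000000000s')
m = BEAST.search(line)
print(m.group('addr'), m.group('status'), m.group('latency'))  # 192.168.122.102 200 0.000000000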
Dec  6 05:10:44 np0005548916 podman[234701]: 2025-12-06 10:10:44.822290438 +0000 UTC m=+0.118230618 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.494 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:10:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:10:45 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2163291013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:10:45 np0005548916 nova_compute[228576]: 2025-12-06 10:10:45.993 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
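The resource-tracker audit shells out to the `ceph df --format=json` command shown in the lines above and below. A sketch that reproduces the probe and pulls the cluster-wide totals (assumes a reachable cluster and the keyring id from the logged command line; the top-level "stats" keys match current Ceph JSON output):

import json
import subprocess

out = subprocess.run(
    ["ceph", "df", "--format=json",
     "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
    capture_output=True, text=True, check=True,
).stdout
stats = json.loads(out)["stats"]
# Cluster-wide capacity and free space, in bytes.
print(stats["total_bytes"], stats["total_avail_bytes"])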
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.170 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.171 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5207MB free_disk=59.94276428222656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.172 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.172 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.328 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.329 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.405 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  6 05:10:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.467 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.468 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
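The inventory just pushed to placement determines schedulable capacity per resource class as (total - reserved) * allocation_ratio, per the placement usage model. Checking that against the logged values:

inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 59, "reserved": 0, "allocation_ratio": 0.9},
}
for rc, inv in inventory.items():
    cap = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, cap)  # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1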
Dec  6 05:10:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.483 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.513 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec  6 05:10:46 np0005548916 nova_compute[228576]: 2025-12-06 10:10:46.531 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:10:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:10:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:46.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:10:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:46.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:10:47 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2493443694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:10:47 np0005548916 nova_compute[228576]: 2025-12-06 10:10:47.040 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:10:47 np0005548916 nova_compute[228576]: 2025-12-06 10:10:47.045 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:10:47 np0005548916 nova_compute[228576]: 2025-12-06 10:10:47.192 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:10:47 np0005548916 nova_compute[228576]: 2025-12-06 10:10:47.194 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:10:47 np0005548916 nova_compute[228576]: 2025-12-06 10:10:47.194 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:10:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:48.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:48 np0005548916 podman[234774]: 2025-12-06 10:10:48.745564925 +0000 UTC m=+0.053337977 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
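The podman health_status events above carry the verdict inside the key=value label list. A hypothetical extraction of the two health fields (the patterns are keyed to the exact label names in these lines; the first "health_status" token in the event has no "=", so only the labels match):

import re

def health_fields(event_line):
    status = re.search(r'health_status=(\w+)', event_line).group(1)
    streak = int(re.search(r'health_failing_streak=(\d+)', event_line).group(1))
    return status, streak

print(health_fields("... health_status=healthy, health_failing_streak=0, ..."))  # ('healthy', 0)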
Dec  6 05:10:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:49 np0005548916 nova_compute[228576]: 2025-12-06 10:10:49.187 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:49 np0005548916 nova_compute[228576]: 2025-12-06 10:10:49.188 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:10:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:52.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:52 np0005548916 podman[234819]: 2025-12-06 10:10:52.877695526 +0000 UTC m=+0.065880407 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:10:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.284 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:10:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.285 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:10:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.285 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:10:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:54.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:54 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec  6 05:10:54 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:54.987366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:10:54 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec  6 05:10:54 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015854987601, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2371, "num_deletes": 251, "total_data_size": 6486955, "memory_usage": 6571544, "flush_reason": "Manual Compaction"}
Dec  6 05:10:54 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855016882, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4184524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25976, "largest_seqno": 28342, "table_properties": {"data_size": 4174854, "index_size": 6100, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20016, "raw_average_key_size": 20, "raw_value_size": 4155559, "raw_average_value_size": 4236, "num_data_blocks": 267, "num_entries": 981, "num_filter_entries": 981, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015644, "oldest_key_time": 1765015644, "file_creation_time": 1765015854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 29530 microseconds, and 12427 cpu microseconds.
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.016949) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4184524 bytes OK
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.016975) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018898) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018915) EVENT_LOG_v1 {"time_micros": 1765015855018909, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018934) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6476383, prev total WAL file size 6476383, number of live WAL files 2.
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.020528) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4086KB)], [51(12MB)]
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855020668, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17518729, "oldest_snapshot_seqno": -1}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5894 keys, 15448376 bytes, temperature: kUnknown
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855125556, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 15448376, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15407601, "index_size": 24921, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14789, "raw_key_size": 149731, "raw_average_key_size": 25, "raw_value_size": 15299586, "raw_average_value_size": 2595, "num_data_blocks": 1018, "num_entries": 5894, "num_filter_entries": 5894, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015855, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.126197) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 15448376 bytes
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.128561) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.8 rd, 147.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.7 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6414, records dropped: 520 output_compression: NoCompression
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.128599) EVENT_LOG_v1 {"time_micros": 1765015855128582, "job": 30, "event": "compaction_finished", "compaction_time_micros": 105011, "compaction_time_cpu_micros": 35598, "output_level": 6, "num_output_files": 1, "total_output_size": 15448376, "num_input_records": 6414, "num_output_records": 5894, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855130358, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855135233, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.020397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
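RocksDB's JOB 30 summary above (166.8 MB/s read, 147.1 MB/s write, read-write-amplify 7.9, write-amplify 3.7) can be re-derived from the byte and time counts in the same event records. A quick check with the logged values:

l0_in = 4_184_524          # table #53, the Level-0 input file
total_in = 17_518_729      # input_data_size (L0 + L6 inputs)
out = 15_448_376           # total_output_size (table #54)
secs = 105_011 / 1e6       # compaction_time_micros

print(round(total_in / secs / 1e6, 1))     # 166.8 MB/s read
print(round(out / secs / 1e6, 1))          # 147.1 MB/s write
print(round((total_in + out) / l0_in, 1))  # 7.9  read-write-amplify
print(round(out / l0_in, 1))               # 3.7  write-amplify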
Dec  6 05:10:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:10:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:10:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:10:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:56.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:10:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:10:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec  6 05:10:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:58.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec  6 05:10:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:00.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:00.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:00 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/938827649' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:02.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:02.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:04.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:04.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:06.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:06.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005280 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:08.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:08.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:10.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:10.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052a0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:12.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:12.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:14.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:14.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:15 np0005548916 podman[234879]: 2025-12-06 10:11:15.795649688 +0000 UTC m=+0.095409516 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
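[editor's note] The podman health_status event above records a periodic run of the container's configured healthcheck ('test': '/openstack/healthcheck', mounted read-only into the container) exiting healthy with no failing streak. The same check can be triggered by hand; a sketch, assuming podman is available on this host:

    import subprocess

    # Exit status 0 means healthy, matching health_status=healthy above.
    result = subprocess.run(
        ["podman", "healthcheck", "run", "ovn_controller"],
        capture_output=True, text=True,
    )
    print(result.returncode)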
Dec  6 05:11:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:16.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:16.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:18.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:18.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:19 np0005548916 podman[234908]: 2025-12-06 10:11:19.738435396 +0000 UTC m=+0.050072637 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  6 05:11:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:20.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:20.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:22.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:22.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:23 np0005548916 podman[234929]: 2025-12-06 10:11:23.764809697 +0000 UTC m=+0.064159704 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  6 05:11:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005320 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:24.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:24.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005320 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:26.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:26.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:27 np0005548916 nova_compute[228576]: 2025-12-06 10:11:27.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:28.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:28.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:30.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:30.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:32.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:32.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:34.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:34.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
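[editor's note] The two dispatched commands above show the mgr (mgr.compute-0.qhdjwa) fetching keyrings via "auth get", as cephadm does when reconciling daemons; the bare from=/entity= lines around them are truncated in the source and are left as-is. An equivalent manual invocation, assuming admin credentials on this node:

    import subprocess

    for entity in ("client.admin", "client.bootstrap-osd"):
        keyring = subprocess.check_output(
            ["ceph", "auth", "get", entity], text=True)
        print(keyring.splitlines()[0])   # e.g. "[client.admin]"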
Dec  6 05:11:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:36.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:36.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:36 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:36.847 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:11:36 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:36.848 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:11:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:37 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:37.851 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
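[editor's note] Taken together, the ovn_metadata_agent lines above show the agent reacting to the SB_Global nb_cfg bump from 8 to 9: it waits one second, then acknowledges by writing neutron:ovn-metadata-sb-cfg=9 into its Chassis_Private external_ids, which is how Neutron sees the agent as alive and in sync. A schematic of that delay-then-ack pattern (names here are illustrative, not the agent's real API):

    import threading

    def ack_nb_cfg(nb_cfg, write_sb_cfg, delay=1.0):
        # write_sb_cfg stands in for the Chassis_Private external_ids
        # update the real agent issues via DbSetCommand.
        threading.Timer(delay, write_sb_cfg, args=(nb_cfg,)).start()

    ack_nb_cfg(9, lambda v: print(f"neutron:ovn-metadata-sb-cfg={v}"))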
Dec  6 05:11:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:38.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:39 np0005548916 nova_compute[228576]: 2025-12-06 10:11:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:39 np0005548916 nova_compute[228576]: 2025-12-06 10:11:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
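[editor's note] The pair of nova_compute lines above shows _reclaim_queued_deletes exiting immediately: with reclaim_instance_interval not set to a positive value, this periodic task never purges soft-deleted instances on this node. A minimal sketch of that guard (the value is inferred from the logged message):

    reclaim_instance_interval = 0   # inferred from "CONF.reclaim_instance_interval <= 0"

    def reclaim_queued_deletes():
        if reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        # otherwise nova would purge SOFT_DELETED instances older than the interval

    reclaim_queued_deletes()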
Dec  6 05:11:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:39 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:39 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:40.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:40.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:42.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:42.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:44 np0005548916 nova_compute[228576]: 2025-12-06 10:11:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:44.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:44.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:11:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:45 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/343641707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:45 np0005548916 nova_compute[228576]: 2025-12-06 10:11:45.977 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.205 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.206 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5209MB free_disk=59.942543029785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.207 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.207 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.291 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.291 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.308 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:11:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:46.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:46 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506613304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.750 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.758 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.773 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.777 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:11:46 np0005548916 nova_compute[228576]: 2025-12-06 10:11:46.777 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
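[editor's note] The update_available_resource cycle above (audit, two "ceph df" calls, inventory comparison, lock released after 0.570s) sizes the RBD-backed DISK_GB inventory from "ceph df --format=json". A sketch of the same derivation plus the schedulable capacity implied by the logged allocation ratios; total_avail_bytes is a standard ceph df JSON field, and the --id/--conf values are copied from the lines above:

    import json, subprocess

    out = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"])
    stats = json.loads(out)["stats"]
    print(f"free ~{stats['total_avail_bytes'] / 2**30:.1f} GiB")  # cf. free_disk=59.94GB above

    # Schedulable capacity from the logged inventory data:
    print("VCPU   :", 8 * 4.0)    # total 8 x allocation_ratio 4.0 = 32
    print("DISK_GB:", 59 * 0.9)   # total 59 x allocation_ratio 0.9 = 53.1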
Dec  6 05:11:46 np0005548916 podman[235136]: 2025-12-06 10:11:46.800987616 +0000 UTC m=+0.100362658 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec  6 05:11:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:46.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:48.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:48 np0005548916 nova_compute[228576]: 2025-12-06 10:11:48.758 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:48.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
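
The mon's _set_new_cache_sizes line repeats unchanged throughout this window; the allocations are round MiB quantities once converted. A quick conversion of the values copied from the line above:

    # Convert the mon cache-tuner values above to MiB.
    MIB = 1024 * 1024
    vals = {
        "cache_size": 1020054731,  # ~972.8 MiB target
        "inc_alloc": 343932928,    # 328 MiB
        "full_alloc": 348127232,   # 332 MiB
        "kv_alloc": 318767104,     # 304 MiB
    }
    for k, v in vals.items():
        print(f"{k:>10}: {v / MIB:8.1f} MiB")
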
Dec  6 05:11:49 np0005548916 nova_compute[228576]: 2025-12-06 10:11:49.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:11:49 np0005548916 nova_compute[228576]: 2025-12-06 10:11:49.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
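
The "Running periodic task ComputeManager._poll_volume_usage / _check_instance_build_time / _poll_rescued_instances" lines are oslo.service's periodic-task runner iterating the manager's decorated methods. A minimal sketch of that mechanism; the class and spacing are illustrative, while the decorator and base class are the standard oslo.service API:

    # Sketch of the oslo.service pattern behind "Running periodic task ...".
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)  # illustrative spacing
        def _poll_volume_usage(self, context):
            pass

    mgr = Manager()
    # run_periodic_tasks logs each task name as it dispatches it
    mgr.run_periodic_tasks(context=None)
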
Dec  6 05:11:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:11:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:50.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:11:50 np0005548916 podman[235166]: 2025-12-06 10:11:50.742301127 +0000 UTC m=+0.048812355 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:11:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:50.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:52.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:52.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:11:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:11:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:11:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec  6 05:11:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:54.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec  6 05:11:54 np0005548916 podman[235212]: 2025-12-06 10:11:54.760140388 +0000 UTC m=+0.058979566 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:11:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:11:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:54.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:11:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:56.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:56.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:58.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:11:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:58.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:00.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:00.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:02.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:02.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:04.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:06.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:06.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101207 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
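
The haproxy wrapper above marks backend/nfs.cephfs.1 DOWN after a Layer4 check, i.e. a plain TCP connect, is refused; the same server returns at 10:12:27 once ganesha accepts connections again. An equivalent L4 probe in Python; the host and port below are placeholders, the real check targets the ganesha backend address from haproxy's configuration:

    # A Layer4 check equivalent to haproxy's: does a TCP connect succeed?
    import socket

    def l4_check(host: str, port: int, timeout: float = 1.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:  # covers ConnectionRefusedError and timeouts
            return False

    # Placeholder target; substitute the ganesha backend from haproxy.cfg.
    print(l4_check("192.0.2.10", 2049))
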
Dec  6 05:12:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:08.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:08.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:10.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:10.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:12.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:14.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:12:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:16.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:16.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:17 np0005548916 podman[235268]: 2025-12-06 10:12:17.809294697 +0000 UTC m=+0.107942315 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:12:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:18.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:18.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:12:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:12:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:20.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:20.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:21 np0005548916 podman[235296]: 2025-12-06 10:12:21.769248589 +0000 UTC m=+0.069701901 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  6 05:12:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
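
The grace window announced at 10:12:16 with "duration 90" is lifted early here at 10:12:22: the reload at 10:12:19 found no clients to wait for (clid count(0)), so the effective grace period is about six seconds rather than the full 90. The arithmetic:

    # Effective grace window from the ganesha events above.
    from datetime import datetime

    enter = datetime.fromisoformat("2025-12-06T10:12:16")
    lift = datetime.fromisoformat("2025-12-06T10:12:22")
    print((lift - enter).total_seconds())  # 6.0 s, vs the announced 90 s
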
Dec  6 05:12:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:22.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:22.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:24.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:24.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:25 np0005548916 podman[235319]: 2025-12-06 10:12:25.81873187 +0000 UTC m=+0.067841275 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:12:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:26.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101227 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:12:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:28.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:30.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:30.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:32.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:34.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:34.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:36.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:36.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:38.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:40 np0005548916 nova_compute[228576]: 2025-12-06 10:12:40.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:40 np0005548916 nova_compute[228576]: 2025-12-06 10:12:40.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:12:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:40.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:12:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:41 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:12:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:42.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:44 np0005548916 nova_compute[228576]: 2025-12-06 10:12:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:44.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:44 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:44.897 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:12:44 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:44.898 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:12:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:44.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:12:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:12:45 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2886513219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:12:45 np0005548916 nova_compute[228576]: 2025-12-06 10:12:45.943 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.113 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5214MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.223 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.223 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.240 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:12:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:46 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:12:46 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1462413702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.682 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.688 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.701 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.702 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:12:46 np0005548916 nova_compute[228576]: 2025-12-06 10:12:46.703 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:12:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:46.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:46.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.703 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.730 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.731 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.731 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:12:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.748 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.748 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.749 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:47 np0005548916 nova_compute[228576]: 2025-12-06 10:12:47.749 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:48.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:48 np0005548916 podman[235527]: 2025-12-06 10:12:48.78361543 +0000 UTC m=+0.087163102 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible)
Dec  6 05:12:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:48.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.821974) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969822229, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1421, "num_deletes": 255, "total_data_size": 3384085, "memory_usage": 3439744, "flush_reason": "Manual Compaction"}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969836937, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2208581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28347, "largest_seqno": 29763, "table_properties": {"data_size": 2202622, "index_size": 3222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12816, "raw_average_key_size": 19, "raw_value_size": 2190437, "raw_average_value_size": 3308, "num_data_blocks": 142, "num_entries": 662, "num_filter_entries": 662, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015855, "oldest_key_time": 1765015855, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 15023 microseconds, and 6050 cpu microseconds.
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.837015) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2208581 bytes OK
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.837044) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838843) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838858) EVENT_LOG_v1 {"time_micros": 1765015969838854, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838877) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3377363, prev total WAL file size 3377363, number of live WAL files 2.
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.839956) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2156KB)], [54(14MB)]
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969840098, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17656957, "oldest_snapshot_seqno": -1}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6030 keys, 17525192 bytes, temperature: kUnknown
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969932427, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17525192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17481301, "index_size": 27717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153754, "raw_average_key_size": 25, "raw_value_size": 17368698, "raw_average_value_size": 2880, "num_data_blocks": 1135, "num_entries": 6030, "num_filter_entries": 6030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.932719) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17525192 bytes
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.934061) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.0 rd, 189.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 14.7 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.9) OK, records in: 6556, records dropped: 526 output_compression: NoCompression
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.934133) EVENT_LOG_v1 {"time_micros": 1765015969934124, "job": 32, "event": "compaction_finished", "compaction_time_micros": 92449, "compaction_time_cpu_micros": 44443, "output_level": 6, "num_output_files": 1, "total_output_size": 17525192, "num_input_records": 6556, "num_output_records": 6030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969934749, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969937718, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.839807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:50 np0005548916 nova_compute[228576]: 2025-12-06 10:12:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:50.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:12:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:50.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:12:51 np0005548916 nova_compute[228576]: 2025-12-06 10:12:51.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:12:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:52.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:52 np0005548916 podman[235557]: 2025-12-06 10:12:52.767920642 +0000 UTC m=+0.063876067 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  6 05:12:52 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:52.900 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:12:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:52.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.287 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:12:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:12:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:12:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:54.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:54.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:56 np0005548916 podman[235605]: 2025-12-06 10:12:56.746122355 +0000 UTC m=+0.058257169 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  6 05:12:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:56.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:12:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:12:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:58.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:12:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:00.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000074s ======
Dec  6 05:13:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:02.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000074s
Dec  6 05:13:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:02.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:04.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:04.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:06.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:08.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:08.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061a0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:10.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:12.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:12.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:14.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:16.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:16.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006200 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:18.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:18.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:19 np0005548916 podman[235661]: 2025-12-06 10:13:19.803658111 +0000 UTC m=+0.098727923 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:13:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:20.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:20.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:21 np0005548916 kernel: ganesha.nfsd[234951]: segfault at 50 ip 00007f22f486932e sp 00007f22b1ffa210 error 4 in libntirpc.so.5.8[7f22f484e000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 05:13:21 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:13:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006220 fd 28 proxy ignored for local
Dec  6 05:13:21 np0005548916 systemd[1]: Started Process Core Dump (PID 235689/UID 0).
Dec  6 05:13:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:22.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:22.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:23 np0005548916 podman[235692]: 2025-12-06 10:13:23.775532537 +0000 UTC m=+0.073874157 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  6 05:13:23 np0005548916 systemd-coredump[235690]: Process 229485 (ganesha.nfsd) of user 0 dumped core.
        Stack trace of thread 94:
        #0  0x00007f22f486932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
        ELF object binary architecture: AMD x86-64
Dec  6 05:13:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:23 np0005548916 systemd[1]: systemd-coredump@8-235689-0.service: Deactivated successfully.
Dec  6 05:13:23 np0005548916 systemd[1]: systemd-coredump@8-235689-0.service: Consumed 2.060s CPU time.
Dec  6 05:13:24 np0005548916 podman[235713]: 2025-12-06 10:13:24.024022732 +0000 UTC m=+0.030967726 container died cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:13:24 np0005548916 systemd[1]: var-lib-containers-storage-overlay-ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84-merged.mount: Deactivated successfully.
Dec  6 05:13:24 np0005548916 podman[235713]: 2025-12-06 10:13:24.065342094 +0000 UTC m=+0.072287048 container remove cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 05:13:24 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:13:24 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 05:13:24 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 3.064s CPU time.
Dec  6 05:13:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:24.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:24.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:26.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:26.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:27 np0005548916 podman[235759]: 2025-12-06 10:13:27.767475591 +0000 UTC m=+0.064947487 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:13:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101328 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:13:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:28.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:28.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:30.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:30.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:32.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:32.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101333 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:13:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:34 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 9.
Dec  6 05:13:34 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:13:34 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 3.064s CPU time.
Dec  6 05:13:34 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:13:34 np0005548916 podman[235856]: 2025-12-06 10:13:34.664906951 +0000 UTC m=+0.048159391 container create 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:13:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:34 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:34 np0005548916 podman[235856]: 2025-12-06 10:13:34.736701877 +0000 UTC m=+0.119954327 container init 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:13:34 np0005548916 podman[235856]: 2025-12-06 10:13:34.644761353 +0000 UTC m=+0.028013843 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:13:34 np0005548916 podman[235856]: 2025-12-06 10:13:34.74168192 +0000 UTC m=+0.124934360 container start 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  6 05:13:34 np0005548916 bash[235856]: 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:13:34 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:13:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:13:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:34.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:34.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:36.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:38.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:38.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:40.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:40 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:13:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:40 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:13:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:40.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:42 np0005548916 nova_compute[228576]: 2025-12-06 10:13:42.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:42 np0005548916 nova_compute[228576]: 2025-12-06 10:13:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:13:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:42.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:42.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:44 np0005548916 nova_compute[228576]: 2025-12-06 10:13:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:13:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:44.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:13:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:46 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:46.013 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:13:46 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:46.015 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:13:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:46.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:13:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:13:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:46.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.490 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:13:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:47 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bd4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:13:47 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4031114062' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:13:47 np0005548916 nova_compute[228576]: 2025-12-06 10:13:47.967 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.157 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5219MB free_disk=59.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.224 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.225 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.238 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:13:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:48 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:48 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:13:48 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281509267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.700 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.708 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.728 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.731 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:13:48 np0005548916 nova_compute[228576]: 2025-12-06 10:13:48.731 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:13:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:48.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:48.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:49 np0005548916 nova_compute[228576]: 2025-12-06 10:13:49.732 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:49 np0005548916 nova_compute[228576]: 2025-12-06 10:13:49.733 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:13:49 np0005548916 nova_compute[228576]: 2025-12-06 10:13:49.733 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:13:49 np0005548916 nova_compute[228576]: 2025-12-06 10:13:49.752 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:13:49 np0005548916 nova_compute[228576]: 2025-12-06 10:13:49.752 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:13:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:13:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101350 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:13:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:50 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:50 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:50.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:50 np0005548916 podman[236131]: 2025-12-06 10:13:50.858793276 +0000 UTC m=+0.155364343 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:13:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:50.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:51 np0005548916 nova_compute[228576]: 2025-12-06 10:13:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:51 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:52.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:13:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:52 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:52.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:53 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:53.017 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:13:53 np0005548916 nova_compute[228576]: 2025-12-06 10:13:53.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:13:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:53 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:13:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:13:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:13:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:54 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:54 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:54 np0005548916 podman[236209]: 2025-12-06 10:13:54.750428669 +0000 UTC m=+0.058724393 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true)
Dec  6 05:13:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:54.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:13:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:54.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:13:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:55 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101355 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:13:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:56 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:56 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:56.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:56.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:57 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:58 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:58 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:58 np0005548916 podman[236229]: 2025-12-06 10:13:58.828287647 +0000 UTC m=+0.084615833 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  6 05:13:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:58.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:13:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:59.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:59 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:00 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:00 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:00.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:01.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:01 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:02 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:02 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:02.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:03.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:03 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:04 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:04 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:14:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:04.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:14:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:14:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:14:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:05 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:06 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:06 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:06.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:07 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:08 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:08 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:08.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:09 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:10 np0005548916 kernel: ganesha.nfsd[236054]: segfault at 50 ip 00007f1c8425e32e sp 00007f1c527fb210 error 4 in libntirpc.so.5.8[7f1c84243000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  6 05:14:10 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:14:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:10 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy ignored for local
Dec  6 05:14:10 np0005548916 systemd[1]: Started Process Core Dump (PID 236254/UID 0).
Dec  6 05:14:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:10.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:11.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:11 np0005548916 systemd-coredump[236255]: Process 235875 (ganesha.nfsd) of user 0 dumped core.
    Stack trace of thread 43:
    #0  0x00007f1c8425e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
    ELF object binary architecture: AMD x86-64
Dec  6 05:14:12 np0005548916 systemd[1]: systemd-coredump@9-236254-0.service: Deactivated successfully.
Dec  6 05:14:12 np0005548916 systemd[1]: systemd-coredump@9-236254-0.service: Consumed 1.390s CPU time.
Dec  6 05:14:12 np0005548916 podman[236260]: 2025-12-06 10:14:12.106221375 +0000 UTC m=+0.027942192 container died 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:14:12 np0005548916 systemd[1]: var-lib-containers-storage-overlay-f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf-merged.mount: Deactivated successfully.
Dec  6 05:14:12 np0005548916 podman[236260]: 2025-12-06 10:14:12.150025588 +0000 UTC m=+0.071746365 container remove 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 05:14:12 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:14:12 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 05:14:12 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.743s CPU time.
Dec  6 05:14:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:13.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:14.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:15.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101416 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:14:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:16.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:17.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.558382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058558663, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1434, "num_deletes": 503, "total_data_size": 2830874, "memory_usage": 2884920, "flush_reason": "Manual Compaction"}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058574492, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1843544, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29768, "largest_seqno": 31197, "table_properties": {"data_size": 1837641, "index_size": 2723, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16377, "raw_average_key_size": 19, "raw_value_size": 1823788, "raw_average_value_size": 2202, "num_data_blocks": 117, "num_entries": 828, "num_filter_entries": 828, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015970, "oldest_key_time": 1765015970, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 16096 microseconds, and 7701 cpu microseconds.
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.574556) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1843544 bytes OK
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.574585) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576471) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576485) EVENT_LOG_v1 {"time_micros": 1765016058576480, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576502) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 2823129, prev total WAL file size 2823129, number of live WAL files 2.
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.577285) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1800KB)], [57(16MB)]
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058577371, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 19368736, "oldest_snapshot_seqno": -1}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5833 keys, 13151442 bytes, temperature: kUnknown
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058638946, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 13151442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13113936, "index_size": 21844, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150788, "raw_average_key_size": 25, "raw_value_size": 13009737, "raw_average_value_size": 2230, "num_data_blocks": 875, "num_entries": 5833, "num_filter_entries": 5833, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.639279) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 13151442 bytes
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.641865) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 314.1 rd, 213.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 16.7 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(17.6) write-amplify(7.1) OK, records in: 6858, records dropped: 1025 output_compression: NoCompression
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.641922) EVENT_LOG_v1 {"time_micros": 1765016058641900, "job": 34, "event": "compaction_finished", "compaction_time_micros": 61669, "compaction_time_cpu_micros": 29630, "output_level": 6, "num_output_files": 1, "total_output_size": 13151442, "num_input_records": 6858, "num_output_records": 5833, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058642683, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058647000, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.577230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
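The rocksdb block above is one complete manual-compaction cycle on the mon store: memtable flush to level-0 table #59, a merge of 1@0 + 1@6 into the new 13.1 MB level-6 table #60, then deletion of both inputs and the old WAL. Each EVENT_LOG_v1 entry carries a JSON object after a fixed marker, so the cycle can be reconstructed mechanically; a minimal sketch, with the input file name a placeholder:

```python
#!/usr/bin/env python3
"""Pull RocksDB EVENT_LOG_v1 JSON payloads out of a syslog capture.
The input file name is an assumption, not part of the log."""
import json

MARKER = "EVENT_LOG_v1 "  # rocksdb prints one JSON object right after this

def iter_events(path):
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            idx = line.find(MARKER)
            if idx == -1:
                continue
            try:
                yield json.loads(line[idx + len(MARKER):])
            except json.JSONDecodeError:
                continue  # truncated or wrapped line: skip rather than guess

if __name__ == "__main__":
    for ev in iter_events("messages.log"):  # hypothetical file name
        if ev.get("event") in ("flush_finished", "compaction_finished"):
            print(ev["event"], "job", ev.get("job"),
                  "lsm_state", ev.get("lsm_state"))
```

Run against this capture it would report the flush_finished event for job 33 and the compaction_finished event for job 34, including the lsm_state transition from [1, 0, 0, 0, 0, 0, 1] to [0, 0, 0, 0, 0, 0, 1] shown above.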
Dec  6 05:14:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:18.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
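The mon's _set_new_cache_sizes line above recurs roughly every five seconds through the rest of this capture; it is the cache autotuner republishing its split of the overall budget, and the three allocations should sum to just under cache_size. Checking the figures from the line itself (all values in bytes):

```python
# Figures copied from the _set_new_cache_sizes line above, in bytes.
cache_size = 1020054731
inc_alloc, full_alloc, kv_alloc = 343932928, 348127232, 318767104

assigned = inc_alloc + full_alloc + kv_alloc
print(assigned)               # 1010827264
print(cache_size - assigned)  # 9227467 -> roughly 8.8 MiB left unassigned
```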
Dec  6 05:14:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:19.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:20.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
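The recurring radosgw trio above (starting new request, req done, and a beast access line) arrives every one to two seconds from 192.168.122.100 and 192.168.122.102, always an anonymous HEAD / over HTTP/1.0 answered 200 with an empty body, which is the usual shape of an external health probe. A minimal sketch of such a probe, with host and port hypothetical since the access line records client addresses but not the listening socket:

```python
# Bare reproduction of the probe pattern seen above: anonymous HEAD /
# over HTTP/1.0. HOST and PORT are hypothetical placeholders.
import socket

HOST, PORT = "192.168.122.101", 8080
with socket.create_connection((HOST, PORT), timeout=5) as s:
    s.sendall(b"HEAD / HTTP/1.0\r\n\r\n")
    status = s.recv(1024).split(b"\r\n", 1)[0]
    print(status.decode("latin-1"))  # a healthy RGW answers 200
```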
Dec  6 05:14:21 np0005548916 podman[236333]: 2025-12-06 10:14:21.813129879 +0000 UTC m=+0.115084217 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:14:22 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 10.
Dec  6 05:14:22 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:14:22 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.743s CPU time.
Dec  6 05:14:22 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
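The systemd sequence above is the tenth automatic restart of the nfs.cephfs unit, so the service is crash-looping rather than being restarted deliberately. A sketch that tallies the highest restart counter per unit across a capture such as this one (input file name is a placeholder):

```python
# Spot crash loops: highest "Scheduled restart" counter per systemd unit.
import collections
import re

PAT = re.compile(r"systemd\[1\]: (\S+): Scheduled restart job, "
                 r"restart counter is at (\d+)\.")
counters = collections.Counter()
with open("messages.log", encoding="utf-8", errors="replace") as fh:  # hypothetical
    for line in fh:
        m = PAT.search(line)
        if m:
            counters[m.group(1)] = max(counters[m.group(1)], int(m.group(2)))

for unit, n in counters.most_common():
    print(f"{n:4d}  {unit}")
```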
Dec  6 05:14:22 np0005548916 podman[236410]: 2025-12-06 10:14:22.644998538 +0000 UTC m=+0.052322835 container create e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:14:22 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:14:22 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:14:22 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:14:22 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:14:22 np0005548916 podman[236410]: 2025-12-06 10:14:22.702691755 +0000 UTC m=+0.110016072 container init e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True)
Dec  6 05:14:22 np0005548916 podman[236410]: 2025-12-06 10:14:22.710363954 +0000 UTC m=+0.117688251 container start e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:14:22 np0005548916 bash[236410]: e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b
Dec  6 05:14:22 np0005548916 podman[236410]: 2025-12-06 10:14:22.625744772 +0000 UTC m=+0.033069119 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:14:22 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:14:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:14:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:22.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:24.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:25 np0005548916 podman[236470]: 2025-12-06 10:14:25.763135263 +0000 UTC m=+0.069278973 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 05:14:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:26.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:27.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:14:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:14:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:28.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:29.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:29 np0005548916 podman[236491]: 2025-12-06 10:14:29.814924406 +0000 UTC m=+0.098417765 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec  6 05:14:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:30.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:31.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:32.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:33.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:14:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
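The banner notwithstanding, this start is degraded: /run/dbus/system_bus_socket does not exist inside the container, so every DBUS registration fails, no usable Kerberos keytab entry was found, and the configuration parsed with no export entries and two unknown blocks (RADOS_URLS, RGW). Ganesha tags each message with a component and a severity field, which makes these easy to pull out of a capture; a minimal sketch:

```python
# Extract WARN/CRIT ganesha messages; `lines` is any iterable of log lines.
import re

SEV = re.compile(r" :([A-Z][A-Z0-9 ]*) :(CRIT|WARN) :(.*)")

def problems(lines):
    for line in lines:
        m = SEV.search(line)
        if m:
            component, level, msg = m.groups()
            yield level, component.strip(), msg.strip()

# Against the startup above this yields, among others:
#   CRIT DBUS    dbus_bus_get failed (Failed to connect to socket ...)
#   WARN CONFIG  Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
```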
Dec  6 05:14:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:35.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e88000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e780016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:36.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:37.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101438 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
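This haproxy line explains the surrounding TIRPC noise: the backend check is Layer-4 only, so each probe opens a TCP connection to ganesha and drops it without a complete RPC exchange, and ganesha logs the dying transport as svc_vc_recv ... (will set dead). A connect-and-close check of the same shape, against a hypothetical endpoint:

```python
# Bare connect-and-close, the same shape as haproxy's Layer-4 check.
import socket

def l4_check(host, port, timeout=2.0):
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True   # haproxy would mark the server UP
    except OSError:
        return False  # DOWN

print(l4_check("127.0.0.1", 2049))  # hypothetical NFS endpoint
```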
Dec  6 05:14:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:38.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:39.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:40.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:41.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:42 np0005548916 nova_compute[228576]: 2025-12-06 10:14:42.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:42 np0005548916 nova_compute[228576]: 2025-12-06 10:14:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:14:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:43.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58000d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:46 np0005548916 nova_compute[228576]: 2025-12-06 10:14:46.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:14:47 np0005548916 nova_compute[228576]: 2025-12-06 10:14:47.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:14:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:47 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:14:47 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/97620197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.010 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.172 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.173 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5189MB free_disk=59.89735412597656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.174 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.174 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.275 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.275 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.295 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:14:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:14:48 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3271139588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.722 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.728 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:14:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.748 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
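Placement turns an inventory like the one above into schedulable capacity, roughly (total - reserved) * allocation_ratio per resource class; a quick check against the logged numbers (the int() rounding here is an assumption):

    inventory = {
        "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 59, "reserved": 0, "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        print(rc, int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"]))
    # VCPU 32, MEMORY_MB 7167, DISK_GB 53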
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.750 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:14:48 np0005548916 nova_compute[228576]: 2025-12-06 10:14:48.750 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
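The waited/held bookkeeping above comes from oslo.concurrency's named internal locks; a minimal sketch of the same idiom (the function body is illustrative, not nova's):

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def _update_available_resource():
        # Runs only while "compute_resources" is held; the wrapper emits
        # the Acquiring/acquired/released lines seen above.
        pass

    # Equivalent context-manager form:
    # with lockutils.lock("compute_resources"): ...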
Dec  6 05:14:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
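A minimal sketch for pulling client, status, and latency out of the beast access lines above; the regex is fitted to this exact layout, not to any documented radosgw format:

    import re

    BEAST = re.compile(
        r'beast: \S+: (?P<client>\S+) - (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
        r'"(?P<req>[^"]+)" (?P<status>\d+) .*latency=(?P<latency>[\d.]+)s'
    )
    line = ('beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous '
            '[06/Dec/2025:10:14:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
            'latency=0.001000025s')
    m = BEAST.search(line)
    print(m["client"], m["status"], m["latency"])
    # 192.168.122.100 200 0.001000025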
Dec  6 05:14:49 np0005548916 nova_compute[228576]: 2025-12-06 10:14:49.744 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:49 np0005548916 nova_compute[228576]: 2025-12-06 10:14:49.762 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:49 np0005548916 nova_compute[228576]: 2025-12-06 10:14:49.762 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:50 np0005548916 nova_compute[228576]: 2025-12-06 10:14:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:50 np0005548916 nova_compute[228576]: 2025-12-06 10:14:50.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:14:50 np0005548916 nova_compute[228576]: 2025-12-06 10:14:50.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:14:50 np0005548916 nova_compute[228576]: 2025-12-06 10:14:50.492 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
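The ComputeManager._* entries above are oslo.service periodic tasks dispatched from run_periodic_tasks (the periodic_task.py:210 frame in each line); a minimal sketch of the pattern, with an illustrative 60s spacing rather than nova's actual setting:

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            # Rebuild the list of instances whose network info cache
            # needs refreshing, as the log lines above describe.
            pass

    # Manager().run_periodic_tasks(None) runs whatever tasks are due.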
Dec  6 05:14:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:51 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:51.231 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:14:51 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:51.234 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:14:51 np0005548916 nova_compute[228576]: 2025-12-06 10:14:51.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:52 np0005548916 podman[236625]: 2025-12-06 10:14:52.810304361 +0000 UTC m=+0.112062042 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
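The health_status=healthy events above are podman container healthchecks (test '/openstack/healthcheck' per the logged config_data); the same check can be run on demand, since `podman healthcheck run` exits 0 when the configured test passes:

    import subprocess

    def is_healthy(name="ovn_controller"):
        # Exit code 0 means the container's configured healthcheck passed.
        return subprocess.run(["podman", "healthcheck", "run", name]).returncode == 0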
Dec  6 05:14:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:52.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:14:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:53.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:14:53 np0005548916 podman[236756]: 2025-12-06 10:14:53.322919497 +0000 UTC m=+0.068640728 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  6 05:14:53 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:14:53 np0005548916 podman[236756]: 2025-12-06 10:14:53.425537355 +0000 UTC m=+0.171258566 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:14:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:53 np0005548916 podman[236874]: 2025-12-06 10:14:53.961214821 +0000 UTC m=+0.104100675 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:14:53 np0005548916 podman[236874]: 2025-12-06 10:14:53.974595322 +0000 UTC m=+0.117481176 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:14:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.290 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:14:54 np0005548916 podman[236990]: 2025-12-06 10:14:54.306167622 +0000 UTC m=+0.060910068 container exec e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  6 05:14:54 np0005548916 podman[236990]: 2025-12-06 10:14:54.314770944 +0000 UTC m=+0.069513370 container exec_died e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  6 05:14:54 np0005548916 podman[237055]: 2025-12-06 10:14:54.543428698 +0000 UTC m=+0.054923200 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:14:54 np0005548916 podman[237055]: 2025-12-06 10:14:54.55766896 +0000 UTC m=+0.069163472 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:14:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:54 np0005548916 podman[237122]: 2025-12-06 10:14:54.784852138 +0000 UTC m=+0.052800667 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, architecture=x86_64, vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, release=1793)
Dec  6 05:14:54 np0005548916 podman[237122]: 2025-12-06 10:14:54.803433277 +0000 UTC m=+0.071381806 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, release=1793, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., description=keepalived for Ceph)
Dec  6 05:14:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:54.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:55 np0005548916 nova_compute[228576]: 2025-12-06 10:14:55.465 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:14:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
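The config rm commands the monitor logs dispatching above (likely the mgr clearing per-host osd_memory_target overrides) have a direct CLI form; the who/name arguments below are copied from the logged command:

    import subprocess

    # Equivalent of the dispatched {"prefix": "config rm",
    # "who": "osd/host:compute-1", "name": "osd_memory_target"} command.
    subprocess.run(
        ["ceph", "config", "rm", "osd/host:compute-1", "osd_memory_target"],
        check=True,
    )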
Dec  6 05:14:56 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:14:56.236 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
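The DbSetCommand transaction above maps onto ovsdbapp's db_set helper; a minimal sketch, assuming `api` is an already-connected ovsdbapp southbound Idl wrapper (the record UUID and external_ids value are copied from the logged txn):

    # `api` is assumed: an ovsdbapp OVN_Southbound API instance.
    api.db_set(
        'Chassis_Private',
        '61eba479-a995-4b31-88b9-8ebfcea9907e',
        ('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),
    ).execute(check_error=True)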
Dec  6 05:14:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:56 np0005548916 podman[237238]: 2025-12-06 10:14:56.753356195 +0000 UTC m=+0.056824606 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  6 05:14:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:14:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:14:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:14:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:14:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:56.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:14:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:57.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:58.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:14:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:59.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:59 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:00 np0005548916 podman[237259]: 2025-12-06 10:15:00.75508945 +0000 UTC m=+0.063494751 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  6 05:15:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:00.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:01.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:01 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:15:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:15:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:02.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:03.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:03 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:04.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:05.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:05 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:06.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:07.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:07 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:08.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:09.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:09 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:10.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:11.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:11 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:12.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:13.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:13 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:14.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:15:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:15.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:15:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:15 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:16.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:15:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:17.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:15:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:17 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:18.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:19 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:20.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:21 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:22.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:15:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:23.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:15:23 np0005548916 podman[237344]: 2025-12-06 10:15:23.78825523 +0000 UTC m=+0.088154241 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:15:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:23 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:24.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:25.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:25 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:27.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:27.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:27 np0005548916 podman[237373]: 2025-12-06 10:15:27.758384674 +0000 UTC m=+0.054769225 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  6 05:15:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:27 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:29.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:29.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:29 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:31.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:31.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:31 np0005548916 podman[237394]: 2025-12-06 10:15:31.747399556 +0000 UTC m=+0.054033678 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible)
Dec  6 05:15:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:31 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:33.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:15:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:33.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:15:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:33 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:35.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:35.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:37.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:37.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:39.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:39.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:41.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:43.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:44 np0005548916 nova_compute[228576]: 2025-12-06 10:15:44.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:44 np0005548916 nova_compute[228576]: 2025-12-06 10:15:44.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:15:44 np0005548916 nova_compute[228576]: 2025-12-06 10:15:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:45.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:46 np0005548916 nova_compute[228576]: 2025-12-06 10:15:46.480 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:46 np0005548916 nova_compute[228576]: 2025-12-06 10:15:46.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:46 np0005548916 nova_compute[228576]: 2025-12-06 10:15:46.481 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  6 05:15:46 np0005548916 nova_compute[228576]: 2025-12-06 10:15:46.509 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  6 05:15:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:47.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:47.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:47 np0005548916 nova_compute[228576]: 2025-12-06 10:15:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:47 np0005548916 nova_compute[228576]: 2025-12-06 10:15:47.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  6 05:15:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.494 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.494 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.539 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:15:48 np0005548916 nova_compute[228576]: 2025-12-06 10:15:48.541 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:15:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:15:48 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2620383535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.013 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
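The audited mon_command dispatches and the processutils round trip above are nova's periodic Ceph capacity probe: the resource tracker shells out to the ceph CLI and parses the JSON it returns. A minimal stand-alone sketch of that probe, assuming the same /etc/ceph/ceph.conf and a keyring for client.openstack are available on the host:

    import json
    import subprocess

    # The exact command logged by oslo_concurrency.processutils above;
    # --format=json makes the output machine-readable.
    cmd = ["ceph", "df", "--format=json",
           "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"]
    out = subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

    stats = json.loads(out)["stats"]  # cluster-wide totals, in bytes
    total_gib = stats["total_bytes"] / 1024 ** 3
    avail_gib = stats["total_avail_bytes"] / 1024 ** 3
    print(f"ceph capacity: {avail_gib:.1f} GiB free of {total_gib:.1f} GiB")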
Dec  6 05:15:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:49.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:49.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.213 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.214 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5193MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.215 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.215 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.777 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.778 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.837 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  6 05:15:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.905 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.906 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.932 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.952 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec  6 05:15:49 np0005548916 nova_compute[228576]: 2025-12-06 10:15:49.968 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:15:50 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:15:50 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4091665223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:15:50 np0005548916 nova_compute[228576]: 2025-12-06 10:15:50.476 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:15:50 np0005548916 nova_compute[228576]: 2025-12-06 10:15:50.483 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:15:50 np0005548916 nova_compute[228576]: 2025-12-06 10:15:50.503 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:15:50 np0005548916 nova_compute[228576]: 2025-12-06 10:15:50.506 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:15:50 np0005548916 nova_compute[228576]: 2025-12-06 10:15:50.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
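The inventory dict logged twice above is what the placement service sizes this provider with; usable capacity per resource class follows placement's (total - reserved) * allocation_ratio rule. A quick check of the figures in the log, with the dict trimmed to just the fields the formula needs:

    # Trimmed copy of the inventory reported for provider
    # ff2f17cb-ff1d-4da7-9560-4be741380cb1 in the records above.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        # Placement's effective capacity: (total - reserved) * allocation_ratio.
        usable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {usable:g} usable")

    # Prints VCPU: 32, MEMORY_MB: 7167, DISK_GB: 53.1 -- consistent with the
    # 8-vCPU / 7679 MB / 59 GB host shown in the hypervisor resource view.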
Dec  6 05:15:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:51.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:51.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:51 np0005548916 nova_compute[228576]: 2025-12-06 10:15:51.485 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:51 np0005548916 nova_compute[228576]: 2025-12-06 10:15:51.486 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:51 np0005548916 nova_compute[228576]: 2025-12-06 10:15:51.486 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:52 np0005548916 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:52 np0005548916 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:15:52 np0005548916 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:15:52 np0005548916 nova_compute[228576]: 2025-12-06 10:15:52.488 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:15:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:53.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:15:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:53.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:15:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.290 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:15:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:15:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
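The Acquiring / acquired / released trio from ovn_metadata_agent is the DEBUG trace that oslo.concurrency's lock helper emits (the inner wrapper at lockutils.py:404/409/423) around ProcessMonitor._check_child_processes. A minimal sketch of the same pattern, assuming only that oslo.concurrency is installed:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        # Body runs with the named lock held; with DEBUG logging enabled,
        # lockutils' inner() wrapper logs the Acquiring / "acquired ::
        # waited" / "released :: held" lines seen above.
        pass

    check_child_processes()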
Dec  6 05:15:54 np0005548916 podman[237521]: 2025-12-06 10:15:54.39198066 +0000 UTC m=+0.081570058 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  6 05:15:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:15:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:55.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:15:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:56 np0005548916 nova_compute[228576]: 2025-12-06 10:15:56.480 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:15:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:57.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:57.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:58 np0005548916 podman[237552]: 2025-12-06 10:15:58.778993562 +0000 UTC m=+0.073100978 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:15:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:59.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:15:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:59.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:59 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:59 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:00 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:16:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:01.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:16:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:01.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:01 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:01.401 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:16:01 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:01.402 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:16:01 np0005548916 podman[237622]: 2025-12-06 10:16:01.883584784 +0000 UTC m=+0.062250560 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:16:01 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:01 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:03.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:03.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:03 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:03 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:05.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:05.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:05 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:05 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:16:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:06 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:07.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:07 np0005548916 ceph-mon[79770]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec  6 05:16:07 np0005548916 ceph-mon[79770]: Cluster is now healthy
Dec  6 05:16:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:07.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:07 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:08 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:09.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:09.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:09 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:09 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003240 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:09 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:11 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:11 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:11.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:11.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:11 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:11.403 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:16:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:11 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e600033e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:13.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:13 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:13 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:14 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:15.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:15.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:15 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:15 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:16 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:17.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:17.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:17 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101617 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:16:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:18 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:19.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:19.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:19 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:19 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:20 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:16:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:21.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:16:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:21.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:21 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:21 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:23.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:23.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:23 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:23 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:24 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:24 np0005548916 podman[237741]: 2025-12-06 10:16:24.825702687 +0000 UTC m=+0.129256627 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:16:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:25.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:25.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:25 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:25 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:26 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:27.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:27 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:28 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:29.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:29.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:29 np0005548916 podman[237770]: 2025-12-06 10:16:29.78004495 +0000 UTC m=+0.077125738 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  6 05:16:29 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:29 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:30 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:31.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:31.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:31 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:31 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:32 np0005548916 podman[237791]: 2025-12-06 10:16:32.74597706 +0000 UTC m=+0.053804691 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:16:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:33.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:33.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:33 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:33 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:34 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:35.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:35.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:35 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:36 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:37.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:37.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:38 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:39.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:39.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:39 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:16:40 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:41.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:41.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:41 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:43.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:43.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:43 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:44 np0005548916 nova_compute[228576]: 2025-12-06 10:16:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:44 np0005548916 nova_compute[228576]: 2025-12-06 10:16:44.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:16:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:44 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:45.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:45 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:46 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:16:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:47.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:16:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.505 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.505 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:16:48 np0005548916 nova_compute[228576]: 2025-12-06 10:16:48.531 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:16:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:48 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:16:48 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2009192655' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.011 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:16:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:49.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.156 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.157 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5175MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.158 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.158 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:16:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:49.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.227 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.228 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.248 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:16:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:16:49 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4057191728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.699 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.706 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.720 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.721 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:16:49 np0005548916 nova_compute[228576]: 2025-12-06 10:16:49.722 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:16:49 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:50 np0005548916 nova_compute[228576]: 2025-12-06 10:16:50.688 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:50 np0005548916 nova_compute[228576]: 2025-12-06 10:16:50.689 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:50 np0005548916 nova_compute[228576]: 2025-12-06 10:16:50.689 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:50 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:16:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:51.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:16:51 np0005548916 nova_compute[228576]: 2025-12-06 10:16:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:51 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:53.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:53.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:53 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:16:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:16:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:16:54 np0005548916 nova_compute[228576]: 2025-12-06 10:16:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:54 np0005548916 nova_compute[228576]: 2025-12-06 10:16:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:16:54 np0005548916 nova_compute[228576]: 2025-12-06 10:16:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:16:54 np0005548916 nova_compute[228576]: 2025-12-06 10:16:54.500 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:16:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:54 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:55.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:55.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:55 np0005548916 podman[237919]: 2025-12-06 10:16:55.802255348 +0000 UTC m=+0.107267694 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Dec  6 05:16:55 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:56 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:16:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:57.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:16:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:16:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:57.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:16:57 np0005548916 nova_compute[228576]: 2025-12-06 10:16:57.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:16:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:58 np0005548916 kernel: ganesha.nfsd[237309]: segfault at 50 ip 00007f9f37c5f32e sp 00007f9ef57f9210 error 4 in libntirpc.so.5.8[7f9f37c44000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 05:16:58 np0005548916 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:16:58 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy ignored for local
Dec  6 05:16:58 np0005548916 systemd[1]: Started Process Core Dump (PID 237947/UID 0).
Dec  6 05:16:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:59.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:16:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:59.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:00 np0005548916 systemd-coredump[237948]: Process 236431 (ganesha.nfsd) of user 0 dumped core.
Stack trace of thread 57:
#0  0x00007f9f37c5f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
ELF object binary architecture: AMD x86-64
Dec  6 05:17:00 np0005548916 systemd[1]: systemd-coredump@10-237947-0.service: Deactivated successfully.
Dec  6 05:17:00 np0005548916 systemd[1]: systemd-coredump@10-237947-0.service: Consumed 1.507s CPU time.
Dec  6 05:17:00 np0005548916 podman[237955]: 2025-12-06 10:17:00.391368037 +0000 UTC m=+0.033829528 container died e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325)
Dec  6 05:17:00 np0005548916 systemd[1]: var-lib-containers-storage-overlay-8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4-merged.mount: Deactivated successfully.
Dec  6 05:17:00 np0005548916 podman[237955]: 2025-12-06 10:17:00.445373492 +0000 UTC m=+0.087834983 container remove e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  6 05:17:00 np0005548916 podman[237954]: 2025-12-06 10:17:00.449227578 +0000 UTC m=+0.082245695 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  6 05:17:00 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:17:00 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec  6 05:17:00 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.790s CPU time.
Dec  6 05:17:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:01.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:03.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:03 np0005548916 podman[238018]: 2025-12-06 10:17:03.157035607 +0000 UTC m=+0.066106076 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  6 05:17:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:03.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101704 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:17:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/101704 (4) : haproxy version is 2.3.17-d1c9119
Dec  6 05:17:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/101704 (4) : path to executable is /usr/local/sbin/haproxy
Dec  6 05:17:04 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [ALERT] 339/101704 (4) : backend 'backend' has no server available!
Dec  6 05:17:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:05.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:05.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:07.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:07.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:09.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:09.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:10 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 11.
Dec  6 05:17:10 np0005548916 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:17:10 np0005548916 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.790s CPU time.
Dec  6 05:17:10 np0005548916 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:17:10 np0005548916 podman[238093]: 2025-12-06 10:17:10.915850758 +0000 UTC m=+0.041348193 container create 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:17:10 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:17:10 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:17:10 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:17:10 np0005548916 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:17:10 np0005548916 podman[238093]: 2025-12-06 10:17:10.977720978 +0000 UTC m=+0.103218433 container init 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:17:10 np0005548916 podman[238093]: 2025-12-06 10:17:10.982426274 +0000 UTC m=+0.107923709 container start 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 05:17:10 np0005548916 bash[238093]: 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97
Dec  6 05:17:10 np0005548916 podman[238093]: 2025-12-06 10:17:10.897522025 +0000 UTC m=+0.023019480 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:17:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:10 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:17:10 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:10 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:17:10 np0005548916 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:17:11 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:11.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:11.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:12 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:17:12 np0005548916 systemd-logind[788]: New session 55 of user zuul.
Dec  6 05:17:12 np0005548916 systemd[1]: Started Session 55 of User zuul.
Dec  6 05:17:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:13 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:17:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:13.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:13.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:15.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:15.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:16 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:17:16 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3986376339' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:17:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:17.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:17.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:18 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:19.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:19.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:21.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:21.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:22 np0005548916 ovs-vsctl[238652]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  6 05:17:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:23.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:23.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:23 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: cache status {prefix=cache status} (starting...)
Dec  6 05:17:23 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:23 np0005548916 lvm[238966]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 05:17:23 np0005548916 lvm[238966]: VG ceph_vg0 finished
Dec  6 05:17:23 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: client ls {prefix=client ls} (starting...)
Dec  6 05:17:23 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: damage ls {prefix=damage ls} (starting...)
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump loads {prefix=dump loads} (starting...)
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  6 05:17:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:25.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:25.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  6 05:17:25 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1011187621' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  6 05:17:25 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/901373131' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  6 05:17:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: ops {prefix=ops} (starting...)
Dec  6 05:17:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/675051402' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2227707575' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:17:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/430822803' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:17:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: session ls {prefix=session ls} (starting...)
Dec  6 05:17:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:17:26 np0005548916 podman[239348]: 2025-12-06 10:17:26.806632968 +0000 UTC m=+0.106889985 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  6 05:17:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: status {prefix=status} (starting...)
Dec  6 05:17:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3824211357' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:17:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:27.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:27.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3343059054' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1402062969' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3965562810' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:17:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1821647303' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/912603139' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1283122563' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/357451755' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:17:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec  6 05:17:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:29.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec  6 05:17:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  6 05:17:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/447826151' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  6 05:17:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:29.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:17:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4036683984' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:17:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:17:30 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1483867891' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:17:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:17:30 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2197448062' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 1007616 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc handle_mgr_map Got map version 35
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77586432 unmapped: 933888 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77594624 unmapped: 925696 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77594624 unmapped: 925696 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77611008 unmapped: 909312 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24eda000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb23db6f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 860160 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 860160 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77676544 unmapped: 843776 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77676544 unmapped: 843776 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 37.000507355s of 37.167388916s, submitted: 29
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 745472 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd3000 session 0x55fb23db7e00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24e98f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 745472 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 720896 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 720896 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.814132690s of 13.842460632s, submitted: 8
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922120 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 688128 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77856768 unmapped: 663552 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 647168 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 647168 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.166560173s of 11.237159729s, submitted: 6
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77905920 unmapped: 614400 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77938688 unmapped: 581632 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77938688 unmapped: 581632 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77971456 unmapped: 548864 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77971456 unmapped: 548864 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78004224 unmapped: 516096 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78004224 unmapped: 516096 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78020608 unmapped: 499712 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78028800 unmapped: 491520 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78028800 unmapped: 491520 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78036992 unmapped: 483328 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78036992 unmapped: 483328 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78045184 unmapped: 475136 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78045184 unmapped: 475136 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78069760 unmapped: 450560 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78069760 unmapped: 450560 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26acba40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78086144 unmapped: 434176 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78086144 unmapped: 434176 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 409600 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 409600 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78118912 unmapped: 401408 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78118912 unmapped: 401408 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 393216 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 60.789440155s of 61.372726440s, submitted: 5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 385024 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 385024 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922404 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26aca1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 368640 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78159872 unmapped: 360448 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78159872 unmapped: 360448 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 352256 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 352256 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 344064 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 344064 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.027969360s of 12.041974068s, submitted: 4
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922288 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78184448 unmapped: 335872 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78184448 unmapped: 335872 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78192640 unmapped: 327680 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79241216 unmapped: 327680 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79241216 unmapped: 327680 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923932 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79273984 unmapped: 294912 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.543926239s of 12.579975128s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80322560 unmapped: 294912 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80330752 unmapped: 286720 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923916 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80330752 unmapped: 286720 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80355328 unmapped: 262144 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80355328 unmapped: 262144 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80363520 unmapped: 253952 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb2719dc20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80363520 unmapped: 253952 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80379904 unmapped: 237568 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80379904 unmapped: 237568 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80396288 unmapped: 221184 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80396288 unmapped: 221184 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.654951096s of 23.672401428s, submitted: 5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926956 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80429056 unmapped: 188416 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80470016 unmapped: 147456 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80478208 unmapped: 139264 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926956 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80478208 unmapped: 139264 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.962579727s of 10.001787186s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80494592 unmapped: 122880 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80502784 unmapped: 114688 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80502784 unmapped: 114688 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80510976 unmapped: 106496 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926349 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80519168 unmapped: 98304 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80519168 unmapped: 98304 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80527360 unmapped: 90112 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80551936 unmapped: 65536 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80568320 unmapped: 49152 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80568320 unmapped: 49152 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80584704 unmapped: 32768 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80584704 unmapped: 32768 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80609280 unmapped: 8192 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80609280 unmapped: 8192 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80625664 unmapped: 1040384 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80625664 unmapped: 1040384 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80642048 unmapped: 1024000 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80642048 unmapped: 1024000 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80658432 unmapped: 1007616 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80658432 unmapped: 1007616 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80674816 unmapped: 991232 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80674816 unmapped: 991232 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80691200 unmapped: 974848 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80691200 unmapped: 974848 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80707584 unmapped: 958464 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80707584 unmapped: 958464 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80715776 unmapped: 950272 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80715776 unmapped: 950272 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80732160 unmapped: 933888 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80732160 unmapped: 933888 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80740352 unmapped: 925696 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80740352 unmapped: 925696 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80756736 unmapped: 909312 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80756736 unmapped: 909312 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80764928 unmapped: 901120 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80781312 unmapped: 884736 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80781312 unmapped: 884736 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80789504 unmapped: 876544 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80789504 unmapped: 876544 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80814080 unmapped: 851968 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80814080 unmapped: 851968 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb2719d2c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80838656 unmapped: 827392 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80863232 unmapped: 802816 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80863232 unmapped: 802816 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.385993958s of 95.397544861s, submitted: 3
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926365 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80887808 unmapped: 778240 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80887808 unmapped: 778240 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80904192 unmapped: 761856 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925774 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 761856 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 761856 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 20.58 MB, 0.03 MB/s
Interval WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtab
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.904463768s of 12.073836327s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925167 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 671744 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 671744 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 663552 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 647168 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 647168 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 638976 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 638976 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 630784 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 630784 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 622592 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 614400 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 614400 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 598016 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 598016 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 581632 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 581632 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 573440 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 573440 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 565248 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 557056 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 557056 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 540672 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 540672 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 532480 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 532480 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb240be000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 516096 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 516096 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 499712 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 499712 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 491520 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 47.327915192s of 47.335025787s, submitted: 2
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 491520 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 475136 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 475136 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 466944 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 450560 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 450560 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 442368 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 442368 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 434176 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.530076981s of 15.585687637s, submitted: 10
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 425984 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 425984 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 417792 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 417792 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 401408 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 401408 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 376832 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 376832 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 368640 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 368640 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 352256 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 352256 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 344064 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 344064 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 311296 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 311296 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 294912 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 294912 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 286720 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 286720 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 262144 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 253952 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 245760 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 245760 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 229376 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 229376 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 80.756385803s of 80.760154724s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1163264 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 1146880 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 1146880 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd3000 session 0x55fb2719d860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 843776 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 843776 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.854737282s of 14.505135536s, submitted: 221
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 819200 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 819200 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 811008 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 778240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 778240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926695 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.295805931s of 14.345085144s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926395 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26ae6780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 64.185989380s of 64.189659119s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926679 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928207 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.424573898s of 11.459362030s, submitted: 10
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927907 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5800 session 0x55fb25e2b0e0
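ms_handle_reset fires when a messenger connection is torn down by the peer; it recurs three times in this window, each with a distinct con/session pointer, which on a quiet cluster usually reflects short-lived client or monitor sessions closing rather than a fault. A throwaway triage counter for how often each daemon logs resets, assuming the syslog prefix format seen throughout this file (adjust the regex for other syslog flavours):

import collections
import re
import sys

# Tally ms_handle_reset events per daemon[pid] from a syslog-format file.
PREFIX = re.compile(r"^\w{3}\s+\d+ \d\d:\d\d:\d\d \S+ (\S+?\[\d+\]): .*ms_handle_reset")

def count_resets(path):
    counts = collections.Counter()
    with open(path) as fh:
        for line in fh:
            m = PREFIX.match(line)
            if m:
                counts[m.group(1)] += 1
    return counts

if __name__ == "__main__":
    for daemon, n in count_resets(sys.argv[1]).most_common():
        print(n, daemon)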
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 46.125885010s of 46.133140564s, submitted: 2
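The _kv_sync_thread utilization line is BlueStore's periodic self-report for the thread that flushes transactions to RocksDB: over the last interval it was idle 46.126 s of 46.133 s wall time and submitted only 2 transactions, so the device is essentially untouched. The busy fraction is simply 1 - idle/total; a small helper to turn these reports into percentages (the parsing format is an assumption taken from the line above):

import re

UTIL = re.compile(r"idle (\d+\.\d+)s of (\d+\.\d+)s, submitted: (\d+)")

def kv_sync_busy(line: str):
    m = UTIL.search(line)
    if not m:
        return None
    idle, total, submitted = float(m[1]), float(m[2]), int(m[3])
    return {"busy_pct": 100.0 * (1 - idle / total), "txns": submitted}

print(kv_sync_busy("idle 46.125885010s of 46.133140564s, submitted: 2"))
# {'busy_pct': ~0.016, 'txns': 2}

The later samples in this window (12.99 s of 13.03 s, 27.33 s of 27.34 s, 48.36 s of 48.45 s) all land well under 1% busy, matching the near-zero data_used figures in the cache lines.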
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929128 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928369 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.988737106s of 13.028569221s, submitted: 12
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb252a5680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.333505630s of 27.336708069s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,2])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.960668564s of 13.129460335s, submitted: 12
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271bda40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.355422974s of 48.445949554s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929442 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb26d65860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.050199509s of 14.091516495s, submitted: 5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930379 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931891 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.031768799s of 12.082296371s, submitted: 15
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f585a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 66.122673035s of 66.133468628s, submitted: 3
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.004903793s of 12.174333572s, submitted: 10
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930693 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26b301e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 119.449485779s of 119.461830139s, submitted: 2
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.086344719s of 14.123706818s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932089 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f51a40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.763069153s of 91.630355835s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932974 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.195486069s of 14.243807793s, submitted: 12
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 1196032 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 podman[240100]: 2025-12-06 10:17:30.770478406 +0000 UTC m=+0.059075642 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb268f6000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.946811676s of 58.467189789s, submitted: 205
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933126 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936166 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935407 data_alloc: 218103808 data_used: 135168
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.869669914s of 14.905448914s, submitted: 12
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 93.416572571s of 93.421234131s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 148 heartbeat osd_stat(store_statfs(0x4fc1e1000/0x0/0x4ffc00000, data 0x574248/0x62a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 10141696 heap: 94126080 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 149 ms_handle_reset con 0x55fb26b43400 session 0x55fb26f9b2c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84090880 unmapped: 18432000 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1056837 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 150 ms_handle_reset con 0x55fb26b43400 session 0x55fb26c185a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1062395 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb24bd2400 session 0x55fb26acb4a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db8d20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.286060333s of 33.423255920s, submitted: 40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064445 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064593 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063986 data_alloc: 218103808 data_used: 143360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.490637779s of 12.528245926s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063702 data_alloc: 218103808 data_used: 139264
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb2417d400 session 0x55fb268d2b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 88793088 unmapped: 13729792 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 89841664 unmapped: 12681216 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 152 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90734592 unmapped: 11788288 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240bc3c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90210304 unmapped: 12312576 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113326 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 153 heartbeat osd_stat(store_statfs(0x4fb1bc000/0x0/0x4ffc00000, data 0x1592679/0x164e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.364974976s of 10.899864197s, submitted: 40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1138423 data_alloc: 218103808 data_used: 8523776
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268d3e00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.835918427s of 12.857731819s, submitted: 18
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,1])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 96157696 unmapped: 6365184 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fad9d000/0x0/0x4ffc00000, data 0x19a366e/0x1a61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9be3000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175489 data_alloc: 218103808 data_used: 8544256
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175673 data_alloc: 218103808 data_used: 8540160
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.407831192s of 10.563117027s, submitted: 55
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175505 data_alloc: 218103808 data_used: 8540160
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175205 data_alloc: 218103808 data_used: 8540160
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb271ad0e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870066643s of 11.887675285s, submitted: 5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb271ac000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43c00 session 0x55fb26a99e00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb23791e00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97861632 unmapped: 13058048 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb2422b860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101277696 unmapped: 9641984 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105553920 unmapped: 5365760 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb24bb70e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.669349670s of 20.780309677s, submitted: 12
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109428736 unmapped: 3588096 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42400 session 0x55fb26d654a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb26da5c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.124892235s of 11.262226105s, submitted: 63
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26a990e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183609 data_alloc: 218103808 data_used: 7954432
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26a96f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183625 data_alloc: 218103808 data_used: 7950336
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26da4780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb23f50000 session 0x55fb26f39c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099353 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781655312s of 12.873902321s, submitted: 32
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099077 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100909056 unmapped: 12107776 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.064629555s of 12.113365173s, submitted: 14
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100573 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100441 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 13164544 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26649680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152691 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100933632 unmapped: 16293888 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.373836517s of 13.440272331s, submitted: 18
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb240be1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100974592 unmapped: 16252928 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100982784 unmapped: 16244736 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1179880 data_alloc: 218103808 data_used: 8491008
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101285888 unmapped: 15941632 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.659880638s of 12.670410156s, submitted: 4
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106905600 unmapped: 10321920 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313916 data_alloc: 234881024 data_used: 12308480
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308244 data_alloc: 234881024 data_used: 12402688
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106823680 unmapped: 10403840 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.410940170s of 25.654003143s, submitted: 125
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 10395648 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26da41e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24e9bc20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26c92000 session 0x55fb2422bc20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26acb2c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26acb860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26acb4a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.730512619s of 28.779657364s, submitted: 23
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26acaf00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24e881e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24e88d20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24e89c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26b30000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b6c00 session 0x55fb26b301e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26b30780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fe1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.621421814s of 17.702753067s, submitted: 25
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105635840 unmapped: 19988480 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1205371 data_alloc: 234881024 data_used: 9601024
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9cc9000/0x0/0x4ffc00000, data 0x18e565b/0x19a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106078208 unmapped: 19546112 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c3e000/0x0/0x4ffc00000, data 0x196f65b/0x1a2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216501 data_alloc: 234881024 data_used: 9588736
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.623050690s of 11.763713837s, submitted: 49
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216802 data_alloc: 234881024 data_used: 9592832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c12000/0x0/0x4ffc00000, data 0x199c65b/0x1a5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217114 data_alloc: 234881024 data_used: 9592832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb240c32c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb240d5800 session 0x55fb24f023c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fc960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26ae6960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26da43c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb24bd4f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e8c00 session 0x55fb24bd5c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5a40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f93a0000/0x0/0x4ffc00000, data 0x220d66b/0x22cc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284570 data_alloc: 234881024 data_used: 9592832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316490 data_alloc: 234881024 data_used: 14340096
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 17719296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 16695296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342938 data_alloc: 234881024 data_used: 18280448
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.930198669s of 22.021116257s, submitted: 16
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343466 data_alloc: 234881024 data_used: 18317312
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 16490496 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114663424 unmapped: 14630912 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 13910016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 13901824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362194 data_alloc: 234881024 data_used: 18333696
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb25e2b860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s
Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.082611084s of 11.189125061s, submitted: 41
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24eda960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362210 data_alloc: 234881024 data_used: 18333696
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb268fbc20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224948 data_alloc: 234881024 data_used: 9592832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb268d2780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.954633713s of 10.025353432s, submitted: 24
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb240c14a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.085247040s of 29.158304214s, submitted: 25
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f503c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dda000/0x0/0x4ffc00000, data 0x17d564b/0x1892000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1178729 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24bd74a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106323968 unmapped: 22970368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.328538895s of 17.374874115s, submitted: 11
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114114560 unmapped: 15179776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297901 data_alloc: 234881024 data_used: 12738560
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116293632 unmapped: 13000704 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298205 data_alloc: 234881024 data_used: 12746752
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bd4b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271ac780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb271ac000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.524518967s of 28.681346893s, submitted: 84
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c18d20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24bd5860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26b31c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26ae72c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9292000/0x0/0x4ffc00000, data 0x231c65b/0x23da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323404 data_alloc: 234881024 data_used: 12775424
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f594a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352973 data_alloc: 234881024 data_used: 16846848
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116736000 unmapped: 12558336 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 12517376 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353581 data_alloc: 234881024 data_used: 16908288
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.605432510s of 17.711872101s, submitted: 26
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1406263 data_alloc: 234881024 data_used: 16941056
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 11190272 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1416385 data_alloc: 234881024 data_used: 16928768
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117432320 unmapped: 11862016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413273 data_alloc: 234881024 data_used: 16928768
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.007748604s of 12.300899506s, submitted: 81
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e41400 session 0x55fb25e2ba40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26c2a000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298825 data_alloc: 234881024 data_used: 12820480
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298993 data_alloc: 234881024 data_used: 12820480
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26c192c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb24e9b680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.708964348s of 12.890141487s, submitted: 37
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb238f32c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb24e88780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59000 session 0x55fb24bb8d20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59c00 session 0x55fb24e890e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24edad20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.401422501s of 16.425735474s, submitted: 9
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109821952 unmapped: 19472384 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109920256 unmapped: 19374080 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb25e2c000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26da4b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166053 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa220000/0x0/0x4ffc00000, data 0x138f64b/0x144c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.438837051s of 13.030948639s, submitted: 230
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109223936 unmapped: 20070400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f503c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f50d20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165513 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25107680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25e2ba40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182596 data_alloc: 218103808 data_used: 6209536
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bb92c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb271ac780
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fd2c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153451 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153603 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980836868s of 17.077106476s, submitted: 31
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153471 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26db9860
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24bd7c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162335 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f501e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa2ea000/0x0/0x4ffc00000, data 0x12c564b/0x1382000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f50f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.982028008s of 10.000374794s, submitted: 6
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb243d0960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db83c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f510e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107429888 unmapped: 21864448 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107438080 unmapped: 21856256 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107454464 unmapped: 21839872 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 21823488 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.154647827s of 29.205055237s, submitted: 16
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26da4960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c2b2c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb2719dc20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc ms_handle_reset ms_handle_reset con 0x55fb26150000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: mgrc handle_mgr_configure stats_period=5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109133824 unmapped: 23314432 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.384727478s of 18.432491302s, submitted: 9
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9a06000/0x0/0x4ffc00000, data 0x1ba866e/0x1c66000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb268d21e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112214016 unmapped: 20234240 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1283175 data_alloc: 234881024 data_used: 11640832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282131 data_alloc: 234881024 data_used: 11640832
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.296810150s of 11.368459702s, submitted: 31
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282111 data_alloc: 234881024 data_used: 11636736
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb271ad680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb268fe000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb25e2a1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163642 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.109940529s of 10.186728477s, submitted: 30
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163202 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.609736443s of 15.626793861s, submitted: 5
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268d32c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99b7000/0x0/0x4ffc00000, data 0x17e864b/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209438 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb26649e00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb268f6960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 25419776 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.847808838s of 16.891704559s, submitted: 6
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb240c2f00
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 26992640 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386308 data_alloc: 234881024 data_used: 11198464
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8877000/0x0/0x4ffc00000, data 0x292864b/0x29e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d800 session 0x55fb240c10e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb271acd20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a99c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268fe1e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119521280 unmapped: 20283392 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1471204 data_alloc: 234881024 data_used: 23625728
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1472076 data_alloc: 234881024 data_used: 23629824
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 14589952 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.025381088s of 17.272668839s, submitted: 73
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f84cf000/0x0/0x4ffc00000, data 0x2ca464b/0x2d61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1526224 data_alloc: 234881024 data_used: 23797760
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129245184 unmapped: 10559488 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129335296 unmapped: 10469376 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f814b000/0x0/0x4ffc00000, data 0x305364b/0x3110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1530806 data_alloc: 234881024 data_used: 23859200
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 10346496 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.850658417s of 15.982189178s, submitted: 58
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26ae6b40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb25e2cb40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305206 data_alloc: 234881024 data_used: 11198464
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb25107c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fad20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115548160 unmapped: 24256512 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb24e894a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115556352 unmapped: 24248320 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd50e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f02000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26ae7a40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24bb85a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.909008026s of 33.959445953s, submitted: 21
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb2422b680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26ae6000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd5680
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f03c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24bd4960
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222243 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a972c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240c32c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb240c21e0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24e89c20
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224048 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.381628036s of 18.489994049s, submitted: 27
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 20135936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.395916939s of 15.577631950s, submitted: 77
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26c2a5a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9234000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 20840448 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892c800 session 0x55fb25e2cb40
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26c192c0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb26f38000
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.909954071s of 36.011264801s, submitted: 36
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb26da45a0
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.256072998s of 21.260541916s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.078483582s of 10.083856583s, submitted: 1
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192880 data_alloc: 218103808 data_used: 4796416
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115810304 unmapped: 23994368 heap: 139804672 old mem: 2845415833 new mem: 2845415833
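The tune_memory records are the OSD's PriorityCache autotuner at work: mapped heap (~115 MiB) sits far below osd_memory_target (4294967296 = 4 GiB), so the aggregate cache budget stays pinned at 2845415833 bytes, which the _resize_shards lines then carve into kv, onode, meta and data pools. A quick arithmetic check of those allocations (plain Python, no Ceph API):

    cache_size = 2845415833            # "new mem" from the tune_memory lines
    alloc = {                          # *_alloc values from _resize_shards
        "kv":       1207959552,
        "kv_onode":  234881024,
        "meta":     1140850688,
        "data":      218103808,
    }
    for name, nbytes in alloc.items():
        print(f"{name:8s} {nbytes / 2**20:6.0f} MiB  {nbytes / cache_size:6.1%}")
    print(f"assigned: {sum(alloc.values()) / cache_size:.1%} of the budget")

The four pools account for ~98.5% of the budget, split roughly 42/8/40/8 between kv, onode, meta and data.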
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.716539383s of 16.728366852s, submitted: 3
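The _kv_sync_thread line is a duty-cycle report for the thread that syncs BlueStore's RocksDB: over the ~16.7 s window it was busy for only about 12 ms across 3 submitted transactions. The same numbers, computed:

    idle, total = 16.716539383, 16.728366852   # from the line above
    busy = total - idle
    print(f"busy {busy * 1e3:.1f} ms over {total:.1f} s "
          f"({busy / total:.3%}), 3 transactions submitted")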
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 24240128 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115302400 unmapped: 24502272 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:17:30 np0005548916 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
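The do_command entries are admin-socket requests ('config diff', 'config show', 'counter dump', 'counter schema', 'log dump'), the usual footprint of a metrics scraper polling the daemon. A sketch of issuing the same commands by hand, assuming a local ceph CLI with access to this OSD's admin socket:

    import json
    import subprocess

    def daemon_cmd(target, *cmd):
        """Run an admin-socket command via the ceph CLI and parse the JSON."""
        out = subprocess.check_output(["ceph", "daemon", target, *cmd])
        return json.loads(out)

    # The same commands the scraper issued above.
    for cmd in (["config", "diff"], ["config", "show"],
                ["counter", "dump"], ["counter", "schema"]):
        result = daemon_cmd("osd.0", *cmd)
        print(" ".join(cmd), "->", len(result), "entries")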
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4105181392' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:17:31 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:17:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:31.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
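The anonymous HEAD / requests from 192.168.122.100 and .102 arriving every two seconds are load-balancer health probes against radosgw's beast frontend. A sketch of the same probe; host and port are assumptions, since the access log only shows the client side:

    import http.client

    HOST, PORT = "localhost", 8080     # assumed RGW endpoint

    conn = http.client.HTTPConnection(HOST, PORT, timeout=2)
    conn.request("HEAD", "/")          # same request as the probes above
    print(conn.getresponse().status)   # a healthy RGW answers 200
    conn.close()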
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/519140837' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  6 05:17:31 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2058821366' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  6 05:17:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
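ganesha.nfsd is cycling through its cluster-wide grace-period state machine: it re-enters a 90 s grace window, reloads client reclaim data from the RADOS recovery backend, and checks whether grace can be lifted (no clients to reclaim, but the enforcement check returns ret=-45, so the window stays up). The shared grace database can be inspected with the ganesha-rados-grace tool; the pool and namespace below are assumptions, as cephadm-managed NFS typically keeps it in the .nfs pool under the cluster id:

    import subprocess

    # Pool and namespace are assumptions (cephadm NFS defaults); adjust to
    # match your deployment before running.
    subprocess.run(
        ["ganesha-rados-grace", "--pool", ".nfs", "--ns", "cephfs", "dump"],
        check=True)   # prints the current epoch and each node's reclaim flags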
Dec  6 05:17:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  6 05:17:32 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1815850061' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  6 05:17:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:33.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:33.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3149206455' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2924573047' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  6 05:17:33 np0005548916 podman[240544]: 2025-12-06 10:17:33.762939613 +0000 UTC m=+0.062277041 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd)
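podman's periodic healthcheck just ran the configured test (/openstack/healthcheck) inside the multipathd container and recorded health_status=healthy with a failing streak of 0. A sketch of reading that status back; the Go template path is the usual podman 4.x layout and may differ on older releases:

    import subprocess

    # Older podman exposed this as .State.Healthcheck.Status instead.
    status = subprocess.check_output(
        ["podman", "inspect", "--format",
         "{{.State.Health.Status}}", "multipathd"],
        text=True).strip()
    print(status)   # expected "healthy", matching health_status above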
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2910934850' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  6 05:17:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996211133' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/745328308' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3333722021' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3551154037' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  6 05:17:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3789003523' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
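Each handle_command/audit pair above is one monitor command dispatched on behalf of client.admin; this burst (mgr dump, crush show-tunables, mgr metadata, crush tree, ...) is characteristic of a full cluster inventory sweep. The same path is scriptable through librados; a minimal sketch, assuming the default conf and admin keyring:

    import json
    import rados   # python3-rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    # One of the audited commands above, sent straight to the monitors.
    ret, outbuf, outs = cluster.mon_command(
        json.dumps({"prefix": "mgr services", "format": "json"}), b"")
    print(ret, json.loads(outbuf or b"{}"))
    cluster.shutdown()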
Dec  6 05:17:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:17:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:35.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:35.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2063758053' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1968791473' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  6 05:17:35 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 05:17:35 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1457285215' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  6 05:17:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/894360360' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1794527057' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/987939842' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  6 05:17:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/558961422' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  6 05:17:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:17:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:37.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:37.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  6 05:17:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2629627461' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  6 05:17:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  6 05:17:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2453809493' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  6 05:17:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1616168784' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:17:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:39.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:39.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2688295501' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.834504) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260834596, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2599, "num_deletes": 251, "total_data_size": 6459693, "memory_usage": 6555664, "flush_reason": "Manual Compaction"}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260854665, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4202135, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31202, "largest_seqno": 33796, "table_properties": {"data_size": 4191058, "index_size": 6931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 26531, "raw_average_key_size": 21, "raw_value_size": 4167766, "raw_average_value_size": 3421, "num_data_blocks": 296, "num_entries": 1218, "num_filter_entries": 1218, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016059, "oldest_key_time": 1765016059, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 20218 microseconds, and 8487 cpu microseconds.
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.854716) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4202135 bytes OK
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.854743) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856182) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856196) EVENT_LOG_v1 {"time_micros": 1765016260856192, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856213) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6447513, prev total WAL file size 6447513, number of live WAL files 2.
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.857648) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4103KB)], [60(12MB)]
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260857859, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 17353577, "oldest_snapshot_seqno": -1}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6530 keys, 15136589 bytes, temperature: kUnknown
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260931779, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 15136589, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15093088, "index_size": 26045, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 167539, "raw_average_key_size": 25, "raw_value_size": 14975612, "raw_average_value_size": 2293, "num_data_blocks": 1047, "num_entries": 6530, "num_filter_entries": 6530, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.932191) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 15136589 bytes
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.933855) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.4 rd, 204.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.5 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 7051, records dropped: 521 output_compression: NoCompression
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.933878) EVENT_LOG_v1 {"time_micros": 1765016260933865, "job": 36, "event": "compaction_finished", "compaction_time_micros": 74038, "compaction_time_cpu_micros": 33777, "output_level": 6, "num_output_files": 1, "total_output_size": 15136589, "num_input_records": 7051, "num_output_records": 6530, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260934659, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260936606, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.857519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:40 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
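The RocksDB sequence above is one manual compaction cycle on the mon store: JOB 35 flushes a ~6.4 MB memtable into L0 table #62 (4.2 MB), JOB 36 merges it with the existing L6 table #60 into #63, and the obsolete SSTs and WAL are deleted immediately. The amplification figures in the JOB 36 summary follow directly from the byte counts logged:

    l0_in = 4202135              # table #62, the freshly flushed L0 input
    l6_in = 17353577 - l0_in     # input_data_size minus the L0 share (table #60)
    out   = 15136589             # total_output_size, the new L6 table #63

    print(f"write-amplify      {out / l0_in:.1f}")                    # 3.6
    print(f"read-write-amplify {(l0_in + l6_in + out) / l0_in:.1f}")  # 7.7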
Dec  6 05:17:41 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  6 05:17:41 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892473375' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  6 05:17:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:41.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:41.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1044876254' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3138446401' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec  6 05:17:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1645793006' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec  6 05:17:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec  6 05:17:43 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/539913026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec  6 05:17:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:43.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:43.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec  6 05:17:44 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/22305521' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec  6 05:17:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:45.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:45.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:45 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec  6 05:17:45 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/917006680' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec  6 05:17:46 np0005548916 nova_compute[228576]: 2025-12-06 10:17:46.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:46 np0005548916 nova_compute[228576]: 2025-12-06 10:17:46.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:17:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec  6 05:17:46 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1990391975' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec  6 05:17:46 np0005548916 ovs-appctl[242760]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:46 np0005548916 ovs-appctl[242764]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:46 np0005548916 ovs-appctl[242770]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:47.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:47.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.500 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:17:48 np0005548916 nova_compute[228576]: 2025-12-06 10:17:48.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:17:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Dec  6 05:17:48 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1455478023' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec  6 05:17:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1693304027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.084 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.267 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4988MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3508033125' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.347 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.347 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:17:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:49.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:49.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.402 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:17:49 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3766028525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.885 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.893 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.914 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.916 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:17:49 np0005548916 nova_compute[228576]: 2025-12-06 10:17:49.916 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
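This is one full pass of nova's update_available_resource periodic task: it takes the compute_resources lock, shells out to ceph df to size the RBD-backed disk pool (the two client.openstack "df" dispatches logged by the mon above), and re-submits an unchanged inventory to placement. Given the allocation ratios in that inventory, placement offers (8 - 0) x 4.0 = 32 VCPUs, (7679 - 512) x 1.0 = 7167 MB of RAM and (59 - 0) x 0.9 = 53.1 GB of disk. A sketch of the probe itself; the key names are assumed to follow the usual ceph df JSON schema:

    import json
    import subprocess

    # Same command nova logs above, run directly.
    raw = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"])
    stats = json.loads(raw)["stats"]
    print(f"free disk: {stats['total_avail_bytes'] / 1024**3:.2f} GiB "
          f"of {stats['total_bytes'] / 1024**3:.2f} GiB")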
Dec  6 05:17:50 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:17:50 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2127390853' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:17:50 np0005548916 nova_compute[228576]: 2025-12-06 10:17:50.916 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:50 np0005548916 nova_compute[228576]: 2025-12-06 10:17:50.917 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec  6 05:17:51 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3232134178' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec  6 05:17:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:17:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:51.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:17:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:51 np0005548916 nova_compute[228576]: 2025-12-06 10:17:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:51 np0005548916 nova_compute[228576]: 2025-12-06 10:17:51.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec  6 05:17:51 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3949318305' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:52 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2596899974' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec  6 05:17:52 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/562050907' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec  6 05:17:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec  6 05:17:53 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1958371454' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:17:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:53 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec  6 05:17:53 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/762392159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.292 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:17:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.294 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:17:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.295 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:17:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec  6 05:17:54 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2059299300' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec  6 05:17:55 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2704565734' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:55.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:55.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:55 np0005548916 nova_compute[228576]: 2025-12-06 10:17:55.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:17:55 np0005548916 nova_compute[228576]: 2025-12-06 10:17:55.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:17:55 np0005548916 nova_compute[228576]: 2025-12-06 10:17:55.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:17:55 np0005548916 nova_compute[228576]: 2025-12-06 10:17:55.497 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:17:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec  6 05:17:56 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/897653303' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:17:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:17:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:17:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:17:57 np0005548916 podman[244616]: 2025-12-06 10:17:57.036666316 +0000 UTC m=+0.180463259 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec  6 05:17:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec  6 05:17:57 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3855571438' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:57.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec  6 05:17:57 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3069230919' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec  6 05:17:58 np0005548916 nova_compute[228576]: 2025-12-06 10:17:58.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:58 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:17:58 np0005548916 systemd[1]: Starting Time & Date Service...
Dec  6 05:17:58 np0005548916 systemd[1]: Started Time & Date Service.
Dec  6 05:17:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:58 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2207639809' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec  6 05:17:59 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1008317242' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:17:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:17:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:17:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:17:59 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec  6 05:18:00 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2167113296' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  6 05:18:01 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec  6 05:18:01 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2123486085' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec  6 05:18:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:01 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:01.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:01 np0005548916 podman[245305]: 2025-12-06 10:18:01.760500116 +0000 UTC m=+0.064001066 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:18:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:03.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:03.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:04 np0005548916 podman[245327]: 2025-12-06 10:18:04.752862265 +0000 UTC m=+0.059099734 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:18:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:05.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:07.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:07.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:09.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:11.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:11.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:13.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:15.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:15 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:15.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:17.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:17.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:18:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:19 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:18:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:19.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:21.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:21.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:23.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:23.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:24 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:25.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:25 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:25.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:27 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:27.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:27.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:27 np0005548916 podman[245491]: 2025-12-06 10:18:27.806097953 +0000 UTC m=+0.104111499 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec  6 05:18:28 np0005548916 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 05:18:28 np0005548916 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 05:18:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:29.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:29 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:31 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:32 np0005548916 podman[245524]: 2025-12-06 10:18:32.408730532 +0000 UTC m=+0.061057803 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 05:18:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:33 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:33.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:35 np0005548916 podman[245546]: 2025-12-06 10:18:35.747048837 +0000 UTC m=+0.055468065 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:18:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:37.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:37.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:41 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:43 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:45.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:45 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:45.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:47 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:48 np0005548916 nova_compute[228576]: 2025-12-06 10:18:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:48 np0005548916 nova_compute[228576]: 2025-12-06 10:18:48.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:48 np0005548916 nova_compute[228576]: 2025-12-06 10:18:48.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
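[editor's note] The _reclaim_queued_deletes message shows the usual gate on this periodic task: with reclaim_instance_interval at its default of 0, soft-deleted instances are never reclaimed by this node. A rough sketch of that guard, illustrative only, not nova's actual code:

    # Rough sketch of the guard reported in the log line above; this is
    # illustrative, not nova's actual implementation.
    class Conf:
        reclaim_instance_interval = 0  # default: task is a no-op

    def reclaim_queued_deletes(conf):
        if conf.reclaim_instance_interval <= 0:
            print('CONF.reclaim_instance_interval <= 0, skipping...')
            return
        # ... otherwise reclaim SOFT_DELETED instances older than the interval

    reclaim_queued_deletes(Conf())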
Dec  6 05:18:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
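[editor's note] For the resource audit, the tracker shells out to the exact command in the previous line and reads cluster capacity from the JSON reply, which is what the libvirt driver does when ephemeral disks live in RBD. A minimal equivalent; the 'stats'/'total_avail_bytes' key layout matches recent Ceph releases but should be treated as an assumption:

    import json
    import subprocess

    # Reproduce the capacity probe from the log line above.
    cmd = ['ceph', 'df', '--format=json', '--id', 'openstack',
           '--conf', '/etc/ceph/ceph.conf']
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    stats = json.loads(out)['stats']  # key layout assumed from recent Ceph
    print('free GiB:', round(stats['total_avail_bytes'] / 1024**3, 2))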
Dec  6 05:18:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:49.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:49 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:49.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:18:49 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2553381759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:18:49 np0005548916 nova_compute[228576]: 2025-12-06 10:18:49.953 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.126 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.128 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5053MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.128 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.129 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.187 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.221 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:18:50 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:18:50 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/748120430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.680 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.687 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.706 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
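[editor's note] Given this inventory, the capacity placement will actually schedule against follows the usual (total - reserved) * allocation_ratio rule: 32 VCPU, 7167 MB of RAM, and 53.1 GB of disk. A quick check:

    # Effective schedulable capacity implied by the inventory above,
    # using placement's (total - reserved) * allocation_ratio rule.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 59,   'reserved': 0,   'allocation_ratio': 0.9},
    }
    for rc, v in inventory.items():
        print(rc, (v['total'] - v['reserved']) * v['allocation_ratio'])
    # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1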
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.708 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:18:50 np0005548916 nova_compute[228576]: 2025-12-06 10:18:50.708 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:18:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:18:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:51 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:51 np0005548916 nova_compute[228576]: 2025-12-06 10:18:51.709 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:51 np0005548916 nova_compute[228576]: 2025-12-06 10:18:51.742 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:51 np0005548916 nova_compute[228576]: 2025-12-06 10:18:51.743 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:51 np0005548916 nova_compute[228576]: 2025-12-06 10:18:51.743 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Session 55 logged out. Waiting for processes to exit.
Dec  6 05:18:53 np0005548916 systemd[1]: session-55.scope: Deactivated successfully.
Dec  6 05:18:53 np0005548916 systemd[1]: session-55.scope: Consumed 3min 447ms CPU time, 760.4M memory peak, read 308.9M from disk, written 70.4M to disk.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Removed session 55.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: New session 56 of user zuul.
Dec  6 05:18:53 np0005548916 systemd[1]: Started Session 56 of User zuul.
Dec  6 05:18:53 np0005548916 nova_compute[228576]: 2025-12-06 10:18:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:53.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:53.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:18:53 np0005548916 systemd[1]: session-56.scope: Deactivated successfully.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Session 56 logged out. Waiting for processes to exit.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Removed session 56.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: New session 57 of user zuul.
Dec  6 05:18:53 np0005548916 systemd[1]: Started Session 57 of User zuul.
Dec  6 05:18:53 np0005548916 systemd[1]: session-57.scope: Deactivated successfully.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Session 57 logged out. Waiting for processes to exit.
Dec  6 05:18:53 np0005548916 systemd-logind[788]: Removed session 57.
Dec  6 05:18:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.294 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:18:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:18:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
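[editor's note] The acquire/acquired/released triplets here (and in the nova lines earlier) are oslo.concurrency's lock instrumentation: the "inner" wrapper in lockutils.py logs how long the caller waited for and then held the named lock. A minimal use of the same API, assuming oslo.concurrency is installed:

    from oslo_concurrency import lockutils

    # The "inner" acquire/held/waited messages above come from this
    # decorator's wrapper; the lock name mirrors the one in the log.
    @lockutils.synchronized('_check_child_processes')
    def check_child_processes():
        pass  # body runs with the named in-process lock held

    check_child_processes()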
Dec  6 05:18:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:55.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:55.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:18:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:18:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:18:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:18:57 np0005548916 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:57 np0005548916 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:18:57 np0005548916 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:18:57 np0005548916 nova_compute[228576]: 2025-12-06 10:18:57.495 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:18:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:57.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:57.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:58 np0005548916 podman[245733]: 2025-12-06 10:18:58.783176539 +0000 UTC m=+0.088540823 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
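[editor's note] podman emits one of these container health_status events per healthcheck run; the 'healthcheck' entry in config_data mounts a script into the container and runs /openstack/healthcheck. One way to query the same state by hand, with the caveat that the .State.Health template path varies across podman versions (older releases used .State.Healthcheck), so treat this as a sketch:

    import json
    import subprocess

    # Read the container health state podman is logging above. The
    # .State.Health path is version-dependent, so this is an assumption.
    out = subprocess.run(
        ['podman', 'inspect', '--format', '{{json .State.Health}}',
         'ovn_controller'],
        capture_output=True, text=True, check=True).stdout
    print(json.loads(out).get('Status'))  # e.g. "healthy"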
Dec  6 05:18:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:59 np0005548916 nova_compute[228576]: 2025-12-06 10:18:59.485 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:18:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:18:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:18:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:18:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:18:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:59.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:19:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:01.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:02 np0005548916 podman[245763]: 2025-12-06 10:19:02.755072451 +0000 UTC m=+0.060278313 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 05:19:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:03.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:05.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:06 np0005548916 podman[245786]: 2025-12-06 10:19:06.757042309 +0000 UTC m=+0.063378360 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 05:19:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:07.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:09.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:09 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:09.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:11.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:11 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:11.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:13 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:15.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:15 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:15.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:17.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:17 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:17.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:19.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:19:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:19.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:19:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:21.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:21 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:21.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:23.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:23 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:23.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:25.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:25 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:25.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:19:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:26 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:19:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:27 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:29 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:29 np0005548916 podman[245993]: 2025-12-06 10:19:29.811866444 +0000 UTC m=+0.114789993 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:19:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:31.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:31 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:31.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:33 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:33 np0005548916 podman[246046]: 2025-12-06 10:19:33.764473258 +0000 UTC m=+0.064155609 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 05:19:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:35.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:35.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:19:35 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 10K writes, 2997 syncs, 3.65 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1703 writes, 5619 keys, 1703 commit groups, 1.0 writes per commit group, ingest: 5.35 MB, 0.01 MB/s
Interval WAL: 1703 writes, 744 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:19:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:37.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:37.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:37 np0005548916 podman[246092]: 2025-12-06 10:19:37.745398884 +0000 UTC m=+0.055924135 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  6 05:19:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:39.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:39.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:41.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:41 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:41.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:43 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:45.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:45.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:19:46 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:19:46 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:19:46 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:19:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:47.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:47 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:47.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:48 np0005548916 nova_compute[228576]: 2025-12-06 10:19:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:48 np0005548916 nova_compute[228576]: 2025-12-06 10:19:48.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:19:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:49.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:49 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:50 np0005548916 nova_compute[228576]: 2025-12-06 10:19:50.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:50 np0005548916 nova_compute[228576]: 2025-12-06 10:19:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.504 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.504 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:19:51 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:19:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:51.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:51.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:19:51 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3584782970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:19:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:51.999 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:19:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.147 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5172MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.249 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.250 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.276 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:19:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:19:52 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3001520887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.736 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.743 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.766 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.768 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:19:52 np0005548916 nova_compute[228576]: 2025-12-06 10:19:52.768 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:53.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:53.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:53 np0005548916 nova_compute[228576]: 2025-12-06 10:19:53.769 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:53 np0005548916 nova_compute[228576]: 2025-12-06 10:19:53.770 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.295 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:54 np0005548916 nova_compute[228576]: 2025-12-06 10:19:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:19:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:55.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:19:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:55.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:19:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:19:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:19:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:19:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:19:57 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:19:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:59 np0005548916 nova_compute[228576]: 2025-12-06 10:19:59.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:59 np0005548916 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:59 np0005548916 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:19:59 np0005548916 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:19:59 np0005548916 nova_compute[228576]: 2025-12-06 10:19:59.492 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:19:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:19:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:59.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:19:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:59 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:59.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:00 np0005548916 ceph-mon[79770]: overall HEALTH_OK
Dec  6 05:20:00 np0005548916 podman[246194]: 2025-12-06 10:20:00.805314405 +0000 UTC m=+0.076977387 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:20:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:01.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:01.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:03 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:04 np0005548916 podman[246222]: 2025-12-06 10:20:04.750156608 +0000 UTC m=+0.055863715 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  6 05:20:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:05 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:05.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:05.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:07 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:07.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:07.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:08 np0005548916 podman[246244]: 2025-12-06 10:20:08.75640063 +0000 UTC m=+0.066412615 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  6 05:20:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:09.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:09 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:09.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:11.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:11 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:11.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:20:12 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.0 total, 600.0 interval
Cumulative writes: 6652 writes, 35K keys, 6652 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
Cumulative WAL: 6652 writes, 6652 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1546 writes, 8111 keys, 1546 commit groups, 1.0 writes per commit group, ingest: 18.02 MB, 0.03 MB/s
Interval WAL: 1546 writes, 1546 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    121.4      0.41              0.18        18    0.023       0      0       0.0       0.0
  L6      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    137.6    119.0      1.92              0.69        17    0.113     94K   9317       0.0       0.0
 Sum      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5    113.2    119.4      2.33              0.87        35    0.067     94K   9317       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9    165.6    169.8      0.41              0.18         8    0.052     26K   2592       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    137.6    119.0      1.92              0.69        17    0.113     94K   9317       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    122.0      0.41              0.18        17    0.024       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 2400.0 total, 600.0 interval
Flush(GB): cumulative 0.049, interval 0.012
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.27 GB write, 0.12 MB/s write, 0.26 GB read, 0.11 MB/s read, 2.3 seconds
Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.4 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 22.89 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000468 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(1382,22.14 MB,7.28405%) FilterBlock(35,279.92 KB,0.0899214%) IndexBlock(35,484.39 KB,0.155605%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Dec  6 05:20:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:13.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:13 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:13.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:15.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:15 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:15.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:17.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:17 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:17.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:20:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:19.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:20:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:19 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:19.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:21.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:21 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:21.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:23 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:23.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:25 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:25.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:25.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:27.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:29.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:29 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:29.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:31 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:31.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:31.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:31 np0005548916 podman[246381]: 2025-12-06 10:20:31.802025907 +0000 UTC m=+0.097355791 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:20:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:32 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:20:32 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:32 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:32 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:20:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:20:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:20:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:35.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:35 np0005548916 podman[246410]: 2025-12-06 10:20:35.77193375 +0000 UTC m=+0.070425085 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:20:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:37 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:37.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:37.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:39.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:39.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:39 np0005548916 podman[246482]: 2025-12-06 10:20:39.777117837 +0000 UTC m=+0.088309148 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  6 05:20:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:41 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:41.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:43.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:43 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:43.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:43 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:45.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:45.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:47 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:47.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:48 np0005548916 nova_compute[228576]: 2025-12-06 10:20:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:48 np0005548916 nova_compute[228576]: 2025-12-06 10:20:48.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:20:48 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:49.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:49.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:50 np0005548916 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:50 np0005548916 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:50 np0005548916 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:50 np0005548916 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:20:50 np0005548916 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:20:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:51.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:51.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:51 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:20:51 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3996973664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:20:51 np0005548916 nova_compute[228576]: 2025-12-06 10:20:51.953 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:20:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.108 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.109 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5189MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.110 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.110 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.404 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.405 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.463 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.517 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.518 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.537 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.556 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:20:52 np0005548916 nova_compute[228576]: 2025-12-06 10:20:52.573 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:20:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:20:52 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/225600293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.001 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.007 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.163 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.165 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.165 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:20:53 np0005548916 nova_compute[228576]: 2025-12-06 10:20:53.493 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:20:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:53 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:53.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:53.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:20:54 np0005548916 nova_compute[228576]: 2025-12-06 10:20:54.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:54 np0005548916 nova_compute[228576]: 2025-12-06 10:20:54.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:55.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:55 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:55.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:56 np0005548916 nova_compute[228576]: 2025-12-06 10:20:56.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:20:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:20:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:20:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:20:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:57.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:20:57 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:57.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:20:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:59 np0005548916 nova_compute[228576]: 2025-12-06 10:20:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:59 np0005548916 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:59 np0005548916 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:20:59 np0005548916 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:20:59 np0005548916 nova_compute[228576]: 2025-12-06 10:20:59.501 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:20:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:20:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:20:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:20:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:59.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:20:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:59 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:59.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:01.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:01.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:02 np0005548916 podman[246583]: 2025-12-06 10:21:02.79504105 +0000 UTC m=+0.101466224 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:21:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:03.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:03 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:03.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:03 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:05 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:06 np0005548916 podman[246611]: 2025-12-06 10:21:06.766274755 +0000 UTC m=+0.076432443 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  6 05:21:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.130300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467130510, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2726, "num_deletes": 508, "total_data_size": 6302447, "memory_usage": 6389056, "flush_reason": "Manual Compaction"}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467152814, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4104756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33801, "largest_seqno": 36522, "table_properties": {"data_size": 4093704, "index_size": 6586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 27198, "raw_average_key_size": 20, "raw_value_size": 4069181, "raw_average_value_size": 3023, "num_data_blocks": 283, "num_entries": 1346, "num_filter_entries": 1346, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016261, "oldest_key_time": 1765016261, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 22554 microseconds, and 12195 cpu microseconds.
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.152902) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4104756 bytes OK
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.152937) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154361) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154383) EVENT_LOG_v1 {"time_micros": 1765016467154378, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154400) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6289133, prev total WAL file size 6289133, number of live WAL files 2.
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.156539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4008KB)], [63(14MB)]
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467156746, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19241345, "oldest_snapshot_seqno": -1}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6841 keys, 17003965 bytes, temperature: kUnknown
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467261596, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 17003965, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16956293, "index_size": 29448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176695, "raw_average_key_size": 25, "raw_value_size": 16831415, "raw_average_value_size": 2460, "num_data_blocks": 1183, "num_entries": 6841, "num_filter_entries": 6841, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.261932) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 17003965 bytes
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.263497) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.4 rd, 162.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 14.4 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(8.8) write-amplify(4.1) OK, records in: 7876, records dropped: 1035 output_compression: NoCompression
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.263518) EVENT_LOG_v1 {"time_micros": 1765016467263508, "job": 38, "event": "compaction_finished", "compaction_time_micros": 104932, "compaction_time_cpu_micros": 63659, "output_level": 6, "num_output_files": 1, "total_output_size": 17003965, "num_input_records": 7876, "num_output_records": 6841, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467264345, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467267075, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.156353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:07.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:07 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:07.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:08 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:09.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:09 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:09.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:10 np0005548916 podman[246633]: 2025-12-06 10:21:10.761319292 +0000 UTC m=+0.065958964 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  6 05:21:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:11 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:11.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:11.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:13 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:13 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:15.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:21:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:15 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:15.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:17.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:17.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:18 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:19.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:19.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.029445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480029485, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 362, "num_deletes": 251, "total_data_size": 331044, "memory_usage": 338048, "flush_reason": "Manual Compaction"}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480033526, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 217819, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36527, "largest_seqno": 36884, "table_properties": {"data_size": 215646, "index_size": 337, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5932, "raw_average_key_size": 20, "raw_value_size": 211357, "raw_average_value_size": 721, "num_data_blocks": 15, "num_entries": 293, "num_filter_entries": 293, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016468, "oldest_key_time": 1765016468, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4132 microseconds, and 1430 cpu microseconds.
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.033581) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 217819 bytes OK
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.033595) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034617) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034635) EVENT_LOG_v1 {"time_micros": 1765016480034630, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034655) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 328607, prev total WAL file size 328607, number of live WAL files 2.
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.035326) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323535' seq:0, type:0; will stop at (end)
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(212KB)], [66(16MB)]
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480035408, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17221784, "oldest_snapshot_seqno": -1}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6624 keys, 13145244 bytes, temperature: kUnknown
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480097435, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 13145244, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13103839, "index_size": 23757, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 172367, "raw_average_key_size": 26, "raw_value_size": 12987337, "raw_average_value_size": 1960, "num_data_blocks": 945, "num_entries": 6624, "num_filter_entries": 6624, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.097730) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 13145244 bytes
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.099125) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 277.2 rd, 211.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.2 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(139.4) write-amplify(60.3) OK, records in: 7134, records dropped: 510 output_compression: NoCompression
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.099169) EVENT_LOG_v1 {"time_micros": 1765016480099158, "job": 40, "event": "compaction_finished", "compaction_time_micros": 62119, "compaction_time_cpu_micros": 30156, "output_level": 6, "num_output_files": 1, "total_output_size": 13145244, "num_input_records": 7134, "num_output_records": 6624, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480099383, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480103628, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.035174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:21.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:21:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:23.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:21:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:23.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:23 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:25.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:25.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:27.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:27.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:29.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:29.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:31.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:31.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:33.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:33.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:33 np0005548916 podman[246690]: 2025-12-06 10:21:33.818174169 +0000 UTC m=+0.122261488 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:21:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:34 np0005548916 nova_compute[228576]: 2025-12-06 10:21:34.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:35.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:35.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:37 np0005548916 podman[246826]: 2025-12-06 10:21:37.750307456 +0000 UTC m=+0.057714080 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec  6 05:21:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:37.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
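The paired HEAD / requests from 192.168.122.100 and 192.168.122.102, repeating every two seconds as anonymous zero-byte HTTP/1.0 probes, look like load-balancer health checks against radosgw (an inference from the pattern, not something the log states). Each beast line carries client, timestamp, request, status and latency, so per-client statistics fall out of one regex; a sketch against the exact format above:

    import re
    import sys
    from collections import defaultdict

    # beast: <ptr>: <client> - <user> [<ts>] "<req>" <status> <bytes> ... latency=<sec>s
    BEAST = re.compile(
        r'beast: \S+: (\S+) - \S+ \[[^\]]+\] "([^"]+)" (\d+) \d+ .* latency=([\d.]+)s')

    stats = defaultdict(lambda: [0, 0.0])   # client -> [count, worst latency]
    for line in sys.stdin:
        m = BEAST.search(line)
        if m:
            client, lat = m.group(1), float(m.group(4))
            stats[client][0] += 1
            stats[client][1] = max(stats[client][1], lat)

    for client, (n, worst) in sorted(stats.items()):
        print(f'{client}: {n} requests, worst latency {worst:.9f}s')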
Dec  6 05:21:38 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:21:38 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:38 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:38 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:21:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:39.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:39.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:41 np0005548916 podman[246848]: 2025-12-06 10:21:41.755606075 +0000 UTC m=+0.065081272 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec  6 05:21:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:41.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:41.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:43.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:43.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
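This _set_new_cache_sizes line repeats with identical numbers throughout the section, so the monitor's cache auto-tuning has settled. The raw byte counts are easier to read in MiB; a quick conversion with the values copied from the line above:

    MiB = 1024 * 1024

    for name, val in [('cache_size', 1020054731),
                      ('inc_alloc',   343932928),
                      ('full_alloc',  348127232),
                      ('kv_alloc',    318767104)]:
        print(f'{name}: {val / MiB:.0f} MiB')
    # cache_size ~973 MiB; inc/full/kv allocations 328/332/304 MiB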
Dec  6 05:21:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:45.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:45.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:47.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:47.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:21:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:49.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:21:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:49.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:51.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:51.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:21:52 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:21:52 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3576677501' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:21:52 np0005548916 nova_compute[228576]: 2025-12-06 10:21:52.937 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.098 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.100 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5173MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.101 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.189 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.219 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:21:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:21:53 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1265861855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.640 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.645 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.661 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.663 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:21:53 np0005548916 nova_compute[228576]: 2025-12-06 10:21:53.663 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
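That completes one update_available_resource sweep: two ceph df calls, an unchanged ProviderTree inventory, and the compute_resources lock held for 0.562s. The inventory dict logged at 10:21:53.661 determines what Placement can offer; under the usual Placement capacity formula, capacity = (total - reserved) * allocation_ratio, the numbers from that line work out as below (a worked check, not output from the service):

    # Values copied from the report.py:940 log line above.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 59,   'reserved': 0,   'allocation_ratio': 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f'{rc}: {capacity:g}')
    # VCPU: 32, MEMORY_MB: 7167, DISK_GB: 53.1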
Dec  6 05:21:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:53.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:53.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:21:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:21:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:21:55 np0005548916 nova_compute[228576]: 2025-12-06 10:21:55.664 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:55 np0005548916 nova_compute[228576]: 2025-12-06 10:21:55.664 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:55.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:55.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:56 np0005548916 nova_compute[228576]: 2025-12-06 10:21:56.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:21:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:21:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:21:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:21:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:21:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:57.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:21:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:59 np0005548916 nova_compute[228576]: 2025-12-06 10:21:59.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:59 np0005548916 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:21:59 np0005548916 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:21:59 np0005548916 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:21:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:59.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:21:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:59.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:59 np0005548916 nova_compute[228576]: 2025-12-06 10:21:59.966 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
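Everything from the _poll_unconfirmed_resizes line at 10:21:52.469 down to here ran under one request id, req-3bbd3549-369b-4c8f-b3d7-c679686facd3, so a whole periodic sweep can be reconstructed by grouping journal lines on that id. A minimal sketch (journal text on stdin; the bracketed context layout is taken from the oslo log lines above):

    import re
    import sys
    from collections import defaultdict

    # oslo context blocks look like "[None req-<uuid> ...]" or "[req-<uuid> ...]".
    REQ = re.compile(r'\[(?:None )?(req-[0-9a-f-]+)')

    by_req = defaultdict(list)
    for line in sys.stdin:
        m = REQ.search(line)
        if m:
            by_req[m.group(1)].append(line.rstrip())

    for req, lines in by_req.items():
        print(f'{req}: {len(lines)} lines')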
Dec  6 05:22:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:01 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:03.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:03 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:03.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:04 np0005548916 podman[246975]: 2025-12-06 10:22:04.812237025 +0000 UTC m=+0.122070415 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec  6 05:22:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:05.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:05 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:05.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:07.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:07 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:07.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:08 np0005548916 podman[247003]: 2025-12-06 10:22:08.757196846 +0000 UTC m=+0.059276584 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:22:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:09 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:11.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:11 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:12 np0005548916 podman[247026]: 2025-12-06 10:22:12.765971663 +0000 UTC m=+0.062910695 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:22:13 np0005548916 nova_compute[228576]: 2025-12-06 10:22:13.537 228580 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:22:13 np0005548916 nova_compute[228576]: 2025-12-06 10:22:13.570 228580 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:22:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:13.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:13 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:15.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:15.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:17.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:17.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:19 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:19.801 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  6 05:22:19 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:19.803 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  6 05:22:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:19 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:19.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:19.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:21 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:21.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:21.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:23 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:25 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:25.806 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
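This DbSetCommand closes the loop opened at 10:22:19: the agent saw SB_Global.nb_cfg move from 13 to 14, announced a 6-second delay, and now acknowledges by writing neutron:ovn-metadata-sb-cfg=14 into its Chassis_Private external_ids. A sketch that pairs the two log lines to measure that ack delay (regexes written against the lines above; illustrative tooling, not part of the agent):

    import re
    import sys
    from datetime import datetime

    TS   = re.compile(r'(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)')
    SEEN = re.compile(r"SB_Global\(.*?nb_cfg=(\d+)")      # new row value, non-greedy
    ACK  = re.compile(r"'neutron:ovn-metadata-sb-cfg': '(\d+)'")

    seen, acked = {}, {}
    for line in sys.stdin:
        t = TS.search(line)
        if not t:
            continue
        when = datetime.strptime(t.group(1), '%Y-%m-%d %H:%M:%S.%f')
        if (m := SEEN.search(line)):
            seen[m.group(1)] = when
        elif (m := ACK.search(line)):
            acked[m.group(1)] = when

    for cfg, t_ack in acked.items():
        if cfg in seen:
            delta = (t_ack - seen[cfg]).total_seconds()
            print(f'nb_cfg {cfg} acked after {delta:.1f}s')
    # For the pair above: nb_cfg 14 acked after 6.0s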
Dec  6 05:22:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:25.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:25.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:27 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:27.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:27.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:29 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:29.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:31.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:31 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:31.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:33.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:33.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:35 np0005548916 podman[247083]: 2025-12-06 10:22:35.770542018 +0000 UTC m=+0.077914837 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:22:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:39 np0005548916 podman[247136]: 2025-12-06 10:22:39.778879635 +0000 UTC m=+0.083322062 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 05:22:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:41 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:41.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:42 np0005548916 podman[247208]: 2025-12-06 10:22:42.869284728 +0000 UTC m=+0.064768891 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  6 05:22:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:22:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:43 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:22:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:43.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:45 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:22:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:47.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:47 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:47.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:48 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:48 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:49.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:49.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:51.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:51.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:52 np0005548916 nova_compute[228576]: 2025-12-06 10:22:52.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:22:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:53.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:53 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:22:53 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1314528910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:22:53 np0005548916 nova_compute[228576]: 2025-12-06 10:22:53.974 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:22:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.186 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5169MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.272 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.272 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.289 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:22:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.298 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:22:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:22:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:22:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:22:54 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3977870626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.757 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.763 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.786 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.789 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:22:54 np0005548916 nova_compute[228576]: 2025-12-06 10:22:54.789 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:22:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:22:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:55.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:22:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:55.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:56 np0005548916 nova_compute[228576]: 2025-12-06 10:22:56.789 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:56 np0005548916 nova_compute[228576]: 2025-12-06 10:22:56.789 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:22:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:22:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:22:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:22:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:57.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:57 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:57 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:57 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:57.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:58 np0005548916 nova_compute[228576]: 2025-12-06 10:22:58.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:22:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:22:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:59.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:22:59 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:22:59 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:59 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:59.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:00 np0005548916 nova_compute[228576]: 2025-12-06 10:23:00.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:01 np0005548916 nova_compute[228576]: 2025-12-06 10:23:01.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:01 np0005548916 nova_compute[228576]: 2025-12-06 10:23:01.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:01 np0005548916 nova_compute[228576]: 2025-12-06 10:23:01.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:23:01 np0005548916 nova_compute[228576]: 2025-12-06 10:23:01.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:23:01 np0005548916 nova_compute[228576]: 2025-12-06 10:23:01.507 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:23:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:01.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:01 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:01 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:01 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:01.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:03.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:03 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:03 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:03 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:03.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:05.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:05 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:05 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:05 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:05.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:06 np0005548916 podman[247367]: 2025-12-06 10:23:06.829349952 +0000 UTC m=+0.129448239 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:23:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:07 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:07 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:07.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:07 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:07 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:07.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:09 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:09 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:09 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:09 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:10 np0005548916 podman[247396]: 2025-12-06 10:23:10.772033337 +0000 UTC m=+0.077070566 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  6 05:23:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:11 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:11.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:11 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:11 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:11 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:11.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:13 np0005548916 podman[247416]: 2025-12-06 10:23:13.766168427 +0000 UTC m=+0.067659963 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:23:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:13 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:13 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:13.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:13 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:13 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:13.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:15.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:15 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:15 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:15 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:15.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:17 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:17 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:17 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:17.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:17 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:17.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:19 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:19 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:19.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:19 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:19 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:19.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:21 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:21 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:21 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:21.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:21 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:21.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:23.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:23 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:23 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:23 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:25.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:25 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:25 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:25 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:25.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:27 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:27 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:27 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:27.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:29 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:29 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:29 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:29.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:31 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:31 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:31 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:31.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:33.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:33 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:33 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:33 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:33.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:35 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:35 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec  6 05:23:35 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:35.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:35 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:35.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec  6 05:23:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:37 np0005548916 podman[247497]: 2025-12-06 10:23:37.385678977 +0000 UTC m=+0.101189056 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:23:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:37 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:37 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:37.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:37 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:37 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:37.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:39 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:39 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:39.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:39 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:39 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:39.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:41 np0005548916 podman[247526]: 2025-12-06 10:23:41.74793509 +0000 UTC m=+0.056478854 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:23:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:41 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:41 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:41 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:41 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:43 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:43 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:43 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:23:43 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:23:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:44 np0005548916 podman[247549]: 2025-12-06 10:23:44.795727042 +0000 UTC m=+0.094425128 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec  6 05:23:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:45 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:45 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:45 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:45.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:45 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:45.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:47 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:47 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:47.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:47 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:47 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:47.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:49 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:23:49 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:49 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:49 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:23:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:49 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:49 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:49.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:49 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:49 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:49.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.082204) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630082434, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1706, "num_deletes": 250, "total_data_size": 4188119, "memory_usage": 4252240, "flush_reason": "Manual Compaction"}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630106061, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 2747624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36889, "largest_seqno": 38590, "table_properties": {"data_size": 2740571, "index_size": 4060, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 13926, "raw_average_key_size": 18, "raw_value_size": 2726508, "raw_average_value_size": 3669, "num_data_blocks": 178, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016481, "oldest_key_time": 1765016481, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 23956 microseconds, and 13178 cpu microseconds.
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.106182) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 2747624 bytes OK
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.106210) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107352) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107370) EVENT_LOG_v1 {"time_micros": 1765016630107365, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107392) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 4180307, prev total WAL file size 4180307, number of live WAL files 2.
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.109003) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323533' seq:72057594037927935, type:22 .. '6B7600353034' seq:0, type:0; will stop at (end)
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2683KB)], [69(12MB)]
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630109176, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15892868, "oldest_snapshot_seqno": -1}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6853 keys, 14489425 bytes, temperature: kUnknown
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630191424, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 14489425, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14445125, "index_size": 26076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 178820, "raw_average_key_size": 26, "raw_value_size": 14323181, "raw_average_value_size": 2090, "num_data_blocks": 1032, "num_entries": 6853, "num_filter_entries": 6853, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.191670) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 14489425 bytes
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.192819) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.1 rd, 176.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 12.5 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(11.1) write-amplify(5.3) OK, records in: 7367, records dropped: 514 output_compression: NoCompression
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.192838) EVENT_LOG_v1 {"time_micros": 1765016630192829, "job": 42, "event": "compaction_finished", "compaction_time_micros": 82324, "compaction_time_cpu_micros": 44731, "output_level": 6, "num_output_files": 1, "total_output_size": 14489425, "num_input_records": 7367, "num_output_records": 6853, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630193526, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630195988, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.108876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196121) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:51 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:51 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:51.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:51 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:51 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:51.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:23:53 np0005548916 nova_compute[228576]: 2025-12-06 10:23:53.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:53.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:53 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:53 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:53 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:53.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:23:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:23:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.506 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:23:54 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:23:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:23:54 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3457887194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:54.999 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.157 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5150MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:23:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:55 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.210 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.211 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.224 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:23:55 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:23:55 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1175837351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.667 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.674 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.690 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
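Note: placement derives schedulable capacity from this inventory as (total - reserved) * allocation_ratio per resource class (stated here as a sketch of the placement model, not a quote of its code). A quick check against the logged values:

    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 59,   'reserved': 0,   'allocation_ratio': 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1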
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.691 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:23:55 np0005548916 nova_compute[228576]: 2025-12-06 10:23:55.691 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:23:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:55 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:55 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:55 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:55 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:55.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
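Note: each radosgw "beast" line is an access-log record carrying the request pointer, client IP, user, timestamp, request line, HTTP status, byte count, and latency; the anonymous HEAD / probes arriving every two seconds from 192.168.122.100 and .102 look like load-balancer health checks. A small parser sketch for this line shape (the regex is written against the samples above, not an official radosgw format spec):

    import re

    BEAST = re.compile(
        r'beast: \S+: (?P<ip>\S+) - (?P<user>\S+) '
        r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
        r'.*latency=(?P<latency>[\d.]+)s')

    line = ('beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous '
            '[06/Dec/2025:10:23:55.994 +0000] "HEAD / HTTP/1.0" 200 0 '
            '- - - latency=0.001000025s')
    m = BEAST.search(line)
    print(m.group('ip'), m.group('req'), m.group('status'), m.group('latency'))
    # 192.168.122.100 HEAD / HTTP/1.0 200 0.001000025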
Dec  6 05:23:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:23:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:23:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:23:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
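Note: these four ganesha events repeat on a roughly five-second cycle throughout this section: the server (re)enters a 90-second grace period, reloads client recovery state from the RADOS backend, finds no clients to reclaim (clid count 0), and reports enforcement status as a negative errno (ret=-45). A sketch for pulling timestamp, function, and message out of these lines (the regex is mine, inferred from the samples):

    import re
    from datetime import datetime

    GANESHA = re.compile(
        r'(?P<ts>\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}) : epoch \S+ : \S+ : '
        r'ganesha\.nfsd-\d+\[\w+\] (?P<func>\w+) '
        r':(?P<comp>[A-Z ]+):(?P<level>\w+) :(?P<msg>.*)')

    line = ('06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : '
            'ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT '
            ':NFS Server Now IN GRACE, duration 90')
    m = GANESHA.search(line)
    ts = datetime.strptime(m.group('ts'), '%d/%m/%Y %H:%M:%S')
    print(ts.isoformat(), m.group('func'), '-', m.group('msg'))
    # 2025-12-06T10:23:56 nfs_start_grace - NFS Server Now IN GRACE, duration 90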
Dec  6 05:23:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:23:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:23:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:58 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:23:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:23:58 np0005548916 nova_compute[228576]: 2025-12-06 10:23:58.691 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:58 np0005548916 nova_compute[228576]: 2025-12-06 10:23:58.692 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:23:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
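Note: _set_new_cache_sizes is the mon's periodic cache tuner. Reading the three printed fields as carve-outs of the overall cache budget (an interpretation, not something the log states), they sum to just under the reported cache_size:

    cache_size = 1020054731
    carve_outs = dict(inc_alloc=343932928, full_alloc=348127232, kv_alloc=318767104)
    total = sum(carve_outs.values())
    print(total, cache_size - total)   # 1010827264, ~8.8 MiB of headroom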
Dec  6 05:23:59 np0005548916 nova_compute[228576]: 2025-12-06 10:23:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:24:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:00.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:00.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:01 np0005548916 nova_compute[228576]: 2025-12-06 10:24:01.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:24:01 np0005548916 nova_compute[228576]: 2025-12-06 10:24:01.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:24:01 np0005548916 nova_compute[228576]: 2025-12-06 10:24:01.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:24:01 np0005548916 nova_compute[228576]: 2025-12-06 10:24:01.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:24:01 np0005548916 nova_compute[228576]: 2025-12-06 10:24:01.487 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
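Note: all the "Running periodic task ComputeManager._*" lines come from oslo.service's periodic-task machinery, which walks the decorated methods of a PeriodicTasks subclass on a timer. A minimal sketch using the real oslo_service.periodic_task decorator (the spacing value is illustrative):

    from oslo_service import periodic_task

    class ComputeManager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)  # illustrative interval
        def _heal_instance_info_cache(self, context):
            # Each timer-driven invocation of a method like this is what
            # produces a "Running periodic task ..." DEBUG line above.
            pass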
Dec  6 05:24:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:02.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:02 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:02.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:04 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:04.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:04.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:06 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:06.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:06.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:07 np0005548916 podman[247756]: 2025-12-06 10:24:07.812178153 +0000 UTC m=+0.114363973 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
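Note: the podman health_status records show each container's health check is just a bind-mounted script ('test': '/openstack/healthcheck' under the 'healthcheck' mount). A sketch translating that config fragment into equivalent podman CLI flags (the mapping is an assumption for illustration; edpm_ansible's actual code path may differ):

    config = {'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller',
                              'test': '/openstack/healthcheck'}}

    def healthcheck_flags(cfg):
        hc = cfg['healthcheck']
        # Mount the check script read-only at /openstack, then point
        # podman's --health-cmd at it.
        return ['--volume', f"{hc['mount']}:/openstack:ro,z",
                '--health-cmd', hc['test']]

    print(' '.join(healthcheck_flags(config)))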
Dec  6 05:24:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:08.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:08.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:10 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:10.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:10.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:12.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:12 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:12.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:12 np0005548916 podman[247785]: 2025-12-06 10:24:12.761184791 +0000 UTC m=+0.063141631 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:24:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:14.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:14 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:14.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:15 np0005548916 podman[247808]: 2025-12-06 10:24:15.774797114 +0000 UTC m=+0.073478748 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:24:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:16 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:18 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:20.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:20 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:20.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:22 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:24:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:24.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:24:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:24.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:26.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:26.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:28.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:28.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:30.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:30.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:32.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:32.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:34.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:34.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:36.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:36.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:38.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:38.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:38 np0005548916 podman[247890]: 2025-12-06 10:24:38.828827827 +0000 UTC m=+0.130903854 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  6 05:24:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:40.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:40.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:42.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.207397) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683207501, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 759, "num_deletes": 251, "total_data_size": 1539291, "memory_usage": 1565312, "flush_reason": "Manual Compaction"}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683215270, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1017152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38595, "largest_seqno": 39349, "table_properties": {"data_size": 1013510, "index_size": 1486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8250, "raw_average_key_size": 19, "raw_value_size": 1006253, "raw_average_value_size": 2362, "num_data_blocks": 65, "num_entries": 426, "num_filter_entries": 426, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016631, "oldest_key_time": 1765016631, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 7895 microseconds, and 3851 cpu microseconds.
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.215303) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1017152 bytes OK
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.215319) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216508) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216532) EVENT_LOG_v1 {"time_micros": 1765016683216517, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216547) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1535275, prev total WAL file size 1535275, number of live WAL files 2.
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.217099) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(993KB)], [72(13MB)]
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683217309, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15506577, "oldest_snapshot_seqno": -1}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6765 keys, 13325700 bytes, temperature: kUnknown
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683310328, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13325700, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13283096, "index_size": 24572, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 177659, "raw_average_key_size": 26, "raw_value_size": 13163739, "raw_average_value_size": 1945, "num_data_blocks": 963, "num_entries": 6765, "num_filter_entries": 6765, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.311186) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13325700 bytes
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.313000) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.3 rd, 142.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.8 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(28.3) write-amplify(13.1) OK, records in: 7279, records dropped: 514 output_compression: NoCompression
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.313039) EVENT_LOG_v1 {"time_micros": 1765016683313021, "job": 44, "event": "compaction_finished", "compaction_time_micros": 93230, "compaction_time_cpu_micros": 56891, "output_level": 6, "num_output_files": 1, "total_output_size": 13325700, "num_input_records": 7279, "num_output_records": 6765, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683313631, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683318386, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318481) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548916 podman[247921]: 2025-12-06 10:24:43.759421957 +0000 UTC m=+0.061919490 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec  6 05:24:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:44.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:44.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:24:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:46 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:46 np0005548916 podman[247944]: 2025-12-06 10:24:46.756425656 +0000 UTC m=+0.053211303 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  6 05:24:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:48.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:48.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:50.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:50.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:52.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:24:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:52.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:24:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:54.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:54.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:54 np0005548916 nova_compute[228576]: 2025-12-06 10:24:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:54 np0005548916 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:54 np0005548916 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:54 np0005548916 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:24:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:56.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:56.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:24:56 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2650253251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:24:56 np0005548916 nova_compute[228576]: 2025-12-06 10:24:56.944 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:24:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:24:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:24:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:24:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.115 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5150MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.482 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.482 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.503 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:24:57 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:24:57 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/796350809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.952 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.959 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.979 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.981 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:24:57 np0005548916 nova_compute[228576]: 2025-12-06 10:24:57.981 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:58.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:24:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:58.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:58 np0005548916 nova_compute[228576]: 2025-12-06 10:24:58.981 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:59 np0005548916 nova_compute[228576]: 2025-12-06 10:24:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:59 np0005548916 nova_compute[228576]: 2025-12-06 10:24:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:00.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:00.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:00 np0005548916 nova_compute[228576]: 2025-12-06 10:25:00.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:25:01 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:25:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:02.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:02.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:03 np0005548916 nova_compute[228576]: 2025-12-06 10:25:03.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:03 np0005548916 nova_compute[228576]: 2025-12-06 10:25:03.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:03 np0005548916 nova_compute[228576]: 2025-12-06 10:25:03.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:25:03 np0005548916 nova_compute[228576]: 2025-12-06 10:25:03.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:25:03 np0005548916 nova_compute[228576]: 2025-12-06 10:25:03.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:25:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:04.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:04.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:06.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:06.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:08.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:08.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:09 np0005548916 podman[248151]: 2025-12-06 10:25:09.824521751 +0000 UTC m=+0.116230539 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:25:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:10.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:10.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:12.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:12.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:14.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:14.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:14 np0005548916 podman[248181]: 2025-12-06 10:25:14.764401892 +0000 UTC m=+0.066749970 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 05:25:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:16.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:16.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:17 np0005548916 podman[248206]: 2025-12-06 10:25:17.763264569 +0000 UTC m=+0.068448312 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:25:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:18.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:18.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:20.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:20.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:22.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:22.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:24.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:26.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:26.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:28.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:28.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:30.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:32.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:32.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:34.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:34.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:36.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:36.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:38.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:38.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:25:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:40.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:25:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:40.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:40 np0005548916 podman[248283]: 2025-12-06 10:25:40.856893893 +0000 UTC m=+0.164209102 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Dec  6 05:25:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:42.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:42.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:44.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:44.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:45 np0005548916 podman[248310]: 2025-12-06 10:25:45.793881699 +0000 UTC m=+0.092894019 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:25:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:25:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:46.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:25:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:46.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:48.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:48.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:48 np0005548916 podman[248331]: 2025-12-06 10:25:48.784001829 +0000 UTC m=+0.075512978 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:25:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:50.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:50.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:52.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:52.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:54.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:54.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:25:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.301 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:25:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.301 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:25:54 np0005548916 nova_compute[228576]: 2025-12-06 10:25:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:54 np0005548916 nova_compute[228576]: 2025-12-06 10:25:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:54 np0005548916 nova_compute[228576]: 2025-12-06 10:25:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  6 05:25:55 np0005548916 nova_compute[228576]: 2025-12-06 10:25:55.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:56.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:56.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:25:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:25:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:25:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:25:57 np0005548916 nova_compute[228576]: 2025-12-06 10:25:57.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:57 np0005548916 nova_compute[228576]: 2025-12-06 10:25:57.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:57 np0005548916 nova_compute[228576]: 2025-12-06 10:25:57.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  6 05:25:57 np0005548916 nova_compute[228576]: 2025-12-06 10:25:57.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:25:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:58.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:25:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:25:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:58.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.507 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  6 05:25:58 np0005548916 nova_compute[228576]: 2025-12-06 10:25:58.535 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:25:58 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:25:58 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2411554739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.021 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:25:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.179 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5181MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.356 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.357 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.444 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.528 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.529 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.553 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.583 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec  6 05:25:59 np0005548916 nova_compute[228576]: 2025-12-06 10:25:59.605 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:26:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:26:00 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3387854361' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.061 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.069 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.102 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.105 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.105 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.106 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  6 05:26:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:00 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:00 np0005548916 nova_compute[228576]: 2025-12-06 10:26:00.257 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  6 05:26:01 np0005548916 podman[248550]: 2025-12-06 10:26:01.881593518 +0000 UTC m=+0.059102930 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 05:26:01 np0005548916 podman[248550]: 2025-12-06 10:26:01.98869579 +0000 UTC m=+0.166205162 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec  6 05:26:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:26:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:02.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:26:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:02.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:02 np0005548916 nova_compute[228576]: 2025-12-06 10:26:02.220 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:26:02 np0005548916 nova_compute[228576]: 2025-12-06 10:26:02.220 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:26:02 np0005548916 podman[248665]: 2025-12-06 10:26:02.40951755 +0000 UTC m=+0.046858166 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:26:02 np0005548916 podman[248665]: 2025-12-06 10:26:02.41757446 +0000 UTC m=+0.054915096 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:26:02 np0005548916 podman[248759]: 2025-12-06 10:26:02.708660175 +0000 UTC m=+0.051594034 container exec 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 05:26:02 np0005548916 podman[248759]: 2025-12-06 10:26:02.721465473 +0000 UTC m=+0.064399302 container exec_died 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:26:02 np0005548916 podman[248823]: 2025-12-06 10:26:02.911756833 +0000 UTC m=+0.048593429 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:26:02 np0005548916 podman[248823]: 2025-12-06 10:26:02.92450741 +0000 UTC m=+0.061344006 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec  6 05:26:03 np0005548916 podman[248890]: 2025-12-06 10:26:03.113020715 +0000 UTC m=+0.046926287 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.buildah.version=1.28.2, distribution-scope=public, architecture=x86_64, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc.)
Dec  6 05:26:03 np0005548916 podman[248890]: 2025-12-06 10:26:03.127502105 +0000 UTC m=+0.061407677 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=2.2.4, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 05:26:03 np0005548916 nova_compute[228576]: 2025-12-06 10:26:03.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:04.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:04 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:04.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:26:05 np0005548916 ceph-mon[79770]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:26:05 np0005548916 nova_compute[228576]: 2025-12-06 10:26:05.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  6 05:26:05 np0005548916 nova_compute[228576]: 2025-12-06 10:26:05.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  6 05:26:05 np0005548916 nova_compute[228576]: 2025-12-06 10:26:05.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  6 05:26:05 np0005548916 nova_compute[228576]: 2025-12-06 10:26:05.517 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  6 05:26:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:06.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:06.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:08.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:08.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:10 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:10.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:10 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:10.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:11 np0005548916 podman[249030]: 2025-12-06 10:26:11.830686432 +0000 UTC m=+0.133222423 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  6 05:26:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:12.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:12 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:12.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:14.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:14 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:14.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:16.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:16 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:16.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:16 np0005548916 podman[249059]: 2025-12-06 10:26:16.76044385 +0000 UTC m=+0.065659993 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  6 05:26:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:18.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:18.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:19 np0005548916 podman[249106]: 2025-12-06 10:26:19.75104411 +0000 UTC m=+0.058086675 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.129506) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780129659, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1286, "num_deletes": 255, "total_data_size": 2971840, "memory_usage": 3005568, "flush_reason": "Manual Compaction"}
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780142596, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1942990, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39354, "largest_seqno": 40635, "table_properties": {"data_size": 1937422, "index_size": 2900, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12264, "raw_average_key_size": 19, "raw_value_size": 1926054, "raw_average_value_size": 3116, "num_data_blocks": 125, "num_entries": 618, "num_filter_entries": 618, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016684, "oldest_key_time": 1765016684, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 13163 microseconds, and 6658 cpu microseconds.
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.142688) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1942990 bytes OK
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.142708) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144111) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144124) EVENT_LOG_v1 {"time_micros": 1765016780144120, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144161) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2965655, prev total WAL file size 2965655, number of live WAL files 2.
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.145201) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303035' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1897KB)], [75(12MB)]
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780145342, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15268690, "oldest_snapshot_seqno": -1}
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6854 keys, 15100115 bytes, temperature: kUnknown
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780218418, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15100115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15055066, "index_size": 26825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 180460, "raw_average_key_size": 26, "raw_value_size": 14932341, "raw_average_value_size": 2178, "num_data_blocks": 1056, "num_entries": 6854, "num_filter_entries": 6854, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:26:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:26:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:20.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.218726) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15100115 bytes
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.220231) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.7 rd, 206.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.7 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(15.6) write-amplify(7.8) OK, records in: 7383, records dropped: 529 output_compression: NoCompression
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.220255) EVENT_LOG_v1 {"time_micros": 1765016780220244, "job": 46, "event": "compaction_finished", "compaction_time_micros": 73167, "compaction_time_cpu_micros": 33140, "output_level": 6, "num_output_files": 1, "total_output_size": 15100115, "num_input_records": 7383, "num_output_records": 6854, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:26:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780220958, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec  6 05:26:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:20.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780224572, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.145116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224694) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548916 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec  6 05:26:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:26:22 np0005548916 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:26:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:24.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:24.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:26.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:26.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:28.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:28.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:30.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:30.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:32.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:32.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:34.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:34.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:36.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:36.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:38.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:38.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:40.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:42.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:42 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:42 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:42 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:42.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:42 np0005548916 podman[249164]: 2025-12-06 10:26:42.796414978 +0000 UTC m=+0.093712290 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:26:44 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:44.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:44 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:44 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:26:44 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:44.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:26:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:46.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:46 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:46 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:46 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:46.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:47 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:47 np0005548916 podman[249192]: 2025-12-06 10:26:47.794597177 +0000 UTC m=+0.095510855 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  6 05:26:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:48.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:48 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:48 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:48 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:49 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:50.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:50 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:50 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:50 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:50.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:50 np0005548916 podman[249213]: 2025-12-06 10:26:50.767130238 +0000 UTC m=+0.071490098 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  6 05:26:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:52 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:52.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:52 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:52 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:52 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:52.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:54 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.302 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:26:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.303 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:26:54 np0005548916 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.303 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:26:54 np0005548916 nova_compute[228576]: 2025-12-06 10:26:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:54 np0005548916 nova_compute[228576]: 2025-12-06 10:26:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:54 np0005548916 nova_compute[228576]: 2025-12-06 10:26:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:26:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:54.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:54 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:54 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:26:54 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:54.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:26:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:26:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:56.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:26:56 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:56 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:56 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:56.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:26:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:26:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:26:57 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:26:57 np0005548916 nova_compute[228576]: 2025-12-06 10:26:57.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:57 np0005548916 nova_compute[228576]: 2025-12-06 10:26:57.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:58.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:58 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:26:58 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:58 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:58.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:26:59 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:26:59 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2022403802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:26:59 np0005548916 nova_compute[228576]: 2025-12-06 10:26:59.951 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.102 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5169MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.179 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.179 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.221 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:27:00 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:27:00 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3629784107' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.644 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.651 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.675 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.676 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:27:00 np0005548916 nova_compute[228576]: 2025-12-06 10:27:00.676 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:27:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:00.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:00 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:00 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:00 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:01 np0005548916 nova_compute[228576]: 2025-12-06 10:27:01.670 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:02 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:02 np0005548916 nova_compute[228576]: 2025-12-06 10:27:02.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:02.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:02 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:02 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:27:02 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:02.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:27:03 np0005548916 nova_compute[228576]: 2025-12-06 10:27:03.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:04 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:04 np0005548916 nova_compute[228576]: 2025-12-06 10:27:04.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:04.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:04 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:04 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:04 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:04.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:06.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:06 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:06 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:06 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:06.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:07 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:07 np0005548916 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:07 np0005548916 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:27:07 np0005548916 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:27:07 np0005548916 nova_compute[228576]: 2025-12-06 10:27:07.494 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:27:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:27:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:08.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:27:08 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:08 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:08 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:08.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:09 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 05:27:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:10.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 05:27:10 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:10 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:10 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:10.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:12 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:12.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:12 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:12 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:12 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:13 np0005548916 podman[249396]: 2025-12-06 10:27:13.841466657 +0000 UTC m=+0.143373755 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:14.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:14 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:14 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:14 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:14.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:27:15 np0005548916 ceph-mon[79770]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:27:15 np0005548916 systemd-logind[788]: New session 58 of user zuul.
Dec  6 05:27:16 np0005548916 systemd[1]: Started Session 58 of User zuul.
Dec  6 05:27:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:16.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:16 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:16 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:16 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:16.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:17 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:18 np0005548916 podman[249604]: 2025-12-06 10:27:18.31430132 +0000 UTC m=+0.060803493 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:27:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:18.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:18 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:18 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:18 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:18.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:19 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:27:19 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/725242139' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:27:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:20.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:20 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:20 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:20 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:20.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:20 np0005548916 podman[249787]: 2025-12-06 10:27:20.946960084 +0000 UTC m=+0.067006337 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  6 05:27:21 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:21 np0005548916 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:22 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:22.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:22 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:22 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:22 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:22.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:22 np0005548916 ovs-vsctl[249841]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  6 05:27:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  6 05:27:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  6 05:27:23 np0005548916 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:27:24 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: cache status {prefix=cache status} (starting...)
Dec  6 05:27:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:24 np0005548916 lvm[250155]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 05:27:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: client ls {prefix=client ls} (starting...)
Dec  6 05:27:24 np0005548916 lvm[250155]: VG ceph_vg0 finished
Dec  6 05:27:24 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:24.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:24 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:24 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:24 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:24.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: damage ls {prefix=damage ls} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump loads {prefix=dump loads} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:25 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  6 05:27:25 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1297629055' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  6 05:27:25 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/878022195' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: ops {prefix=ops} (starting...)
Dec  6 05:27:26 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2966237147' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  6 05:27:26 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/118631599' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  6 05:27:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:26.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:26 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:26 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:26 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:26.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:27 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:27:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/780400855' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:27:27 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: session ls {prefix=session ls} (starting...)
Dec  6 05:27:27 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec  6 05:27:27 np0005548916 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: status {prefix=status} (starting...)
Dec  6 05:27:27 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:27:27 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2396899628' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2579997392' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3061321017' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2768020805' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:27:28 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/389558928' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:27:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:28 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:28 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:28 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:28.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4226561449' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4135499701' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1023966935' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  6 05:27:29 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2104238276' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  6 05:27:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:27:30 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3088289533' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:27:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:30.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:30 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:30 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:30 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:30.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:30 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:27:30 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/907444341' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb252a5680
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.333505630s of 27.336708069s, submitted: 1
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,2])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.960668564s of 13.129460335s, submitted: 12
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271bda40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.355422974s of 48.445949554s, submitted: 1
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929442 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb26d65860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.050199509s of 14.091516495s, submitted: 5
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930379 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931891 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.031768799s of 12.082296371s, submitted: 15
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f585a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 66.122673035s of 66.133468628s, submitted: 3
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.004903793s of 12.174333572s, submitted: 10
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930693 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26b301e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 119.449485779s of 119.461830139s, submitted: 2
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.086344719s of 14.123706818s, submitted: 11
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932089 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f51a40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.763069153s of 91.630355835s, submitted: 1
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932974 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.195486069s of 14.243807793s, submitted: 12
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 1196032 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb268f6000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.946811676s of 58.467189789s, submitted: 205
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933126 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936166 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935407 data_alloc: 218103808 data_used: 135168
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.869669914s of 14.905448914s, submitted: 12
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 93.416572571s of 93.421234131s, submitted: 1
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 148 heartbeat osd_stat(store_statfs(0x4fc1e1000/0x0/0x4ffc00000, data 0x574248/0x62a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 10141696 heap: 94126080 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 149 ms_handle_reset con 0x55fb26b43400 session 0x55fb26f9b2c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84090880 unmapped: 18432000 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1056837 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 150 ms_handle_reset con 0x55fb26b43400 session 0x55fb26c185a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1062395 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb24bd2400 session 0x55fb26acb4a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db8d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.286060333s of 33.423255920s, submitted: 40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064445 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064593 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063986 data_alloc: 218103808 data_used: 143360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.490637779s of 12.528245926s, submitted: 11
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063702 data_alloc: 218103808 data_used: 139264
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb2417d400 session 0x55fb268d2b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 88793088 unmapped: 13729792 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 89841664 unmapped: 12681216 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 152 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90734592 unmapped: 11788288 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240bc3c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90210304 unmapped: 12312576 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113326 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 153 heartbeat osd_stat(store_statfs(0x4fb1bc000/0x0/0x4ffc00000, data 0x1592679/0x164e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.364974976s of 10.899864197s, submitted: 40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1138423 data_alloc: 218103808 data_used: 8523776
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268d3e00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.835918427s of 12.857731819s, submitted: 18
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,1])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 96157696 unmapped: 6365184 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fad9d000/0x0/0x4ffc00000, data 0x19a366e/0x1a61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9be3000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175489 data_alloc: 218103808 data_used: 8544256
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175673 data_alloc: 218103808 data_used: 8540160
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.407831192s of 10.563117027s, submitted: 55
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175505 data_alloc: 218103808 data_used: 8540160
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175205 data_alloc: 218103808 data_used: 8540160
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb271ad0e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870066643s of 11.887675285s, submitted: 5
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb271ac000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43c00 session 0x55fb26a99e00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb23791e00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97861632 unmapped: 13058048 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb2422b860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101277696 unmapped: 9641984 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105553920 unmapped: 5365760 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb24bb70e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.669349670s of 20.780309677s, submitted: 12
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109428736 unmapped: 3588096 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42400 session 0x55fb26d654a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb26da5c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.124892235s of 11.262226105s, submitted: 63
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26a990e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183609 data_alloc: 218103808 data_used: 7954432
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26a96f00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183625 data_alloc: 218103808 data_used: 7950336
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26da4780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb23f50000 session 0x55fb26f39c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099353 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781655312s of 12.873902321s, submitted: 32
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099077 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100909056 unmapped: 12107776 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.064629555s of 12.113365173s, submitted: 14
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100573 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100441 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 13164544 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26649680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152691 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100933632 unmapped: 16293888 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.373836517s of 13.440272331s, submitted: 18
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb240be1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100974592 unmapped: 16252928 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100982784 unmapped: 16244736 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1179880 data_alloc: 218103808 data_used: 8491008
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101285888 unmapped: 15941632 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.659880638s of 12.670410156s, submitted: 4
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106905600 unmapped: 10321920 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313916 data_alloc: 234881024 data_used: 12308480
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308244 data_alloc: 234881024 data_used: 12402688
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106823680 unmapped: 10403840 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.410940170s of 25.654003143s, submitted: 125
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 10395648 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26da41e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24e9bc20
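
The ms_handle_reset entries mark messenger sessions being torn down after the remote side dropped the connection; interleaved with otherwise idle heartbeats, they read as routine churn from short-lived peers rather than a fault. The con pointer identifies the Connection object and session the per-session state, so repeated resets on one con are easy to tally. A small stdin filter (tally_resets.py is a hypothetical name; usage: python3 tally_resets.py < /var/log/messages):

    import re
    import sys
    from collections import Counter

    RESET_RE = re.compile(r"ms_handle_reset con (0x[0-9a-f]+) session 0x[0-9a-f]+")

    # Count resets per Connection pointer across a whole log.
    resets = Counter()
    for line in sys.stdin:
        m = RESET_RE.search(line)
        if m:
            resets[m.group(1)] += 1

    for con, n in resets.most_common():
        print(f"{con}: {n} resets")
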
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26c92000 session 0x55fb2422bc20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26acb2c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26acb860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26acb4a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.730512619s of 28.779657364s, submitted: 23
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26acaf00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24e881e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24e88d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24e89c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26b30000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b6c00 session 0x55fb26b301e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26b30780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fe1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.621421814s of 17.702753067s, submitted: 25
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105635840 unmapped: 19988480 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1205371 data_alloc: 234881024 data_used: 9601024
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9cc9000/0x0/0x4ffc00000, data 0x18e565b/0x19a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106078208 unmapped: 19546112 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c3e000/0x0/0x4ffc00000, data 0x196f65b/0x1a2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216501 data_alloc: 234881024 data_used: 9588736
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.623050690s of 11.763713837s, submitted: 49
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216802 data_alloc: 234881024 data_used: 9592832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c12000/0x0/0x4ffc00000, data 0x199c65b/0x1a5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217114 data_alloc: 234881024 data_used: 9592832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb240c32c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb240d5800 session 0x55fb24f023c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fc960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26ae6960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26da43c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb24bd4f00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e8c00 session 0x55fb24bd5c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5a40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f93a0000/0x0/0x4ffc00000, data 0x220d66b/0x22cc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284570 data_alloc: 234881024 data_used: 9592832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316490 data_alloc: 234881024 data_used: 14340096
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 17719296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 16695296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342938 data_alloc: 234881024 data_used: 18280448
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.930198669s of 22.021116257s, submitted: 16
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343466 data_alloc: 234881024 data_used: 18317312
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 16490496 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114663424 unmapped: 14630912 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 13910016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 13901824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362194 data_alloc: 234881024 data_used: 18333696
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb25e2b860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s#012Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
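
The DUMPING STATS message above is RocksDB's periodic multi-line report flattened onto one syslog line, with each embedded newline escaped by rsyslog as #012 (octal 012, a newline). Decoded, it shows a quiet store: 9251 cumulative writes carrying 35K keys over 1800.1 s of uptime, 2253 WAL syncs, roughly 0.01 MB/s of ingest, and zero write stalls in both the cumulative and the 600 s interval columns. A decoding sketch:

    def unescape_rsyslog(msg):
        """Expand rsyslog's #012 control-character escape back into newlines."""
        return msg.replace("#012", "\n")

    flat = ("** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval"
            "#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent")
    print(unescape_rsyslog(flat))
    # ** DB Stats **
    # Uptime(secs): 1800.1 total, 600.0 interval
    # Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
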
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.082611084s of 11.189125061s, submitted: 41
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24eda960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362210 data_alloc: 234881024 data_used: 18333696
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb268fbc20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224948 data_alloc: 234881024 data_used: 9592832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb268d2780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.954633713s of 10.025353432s, submitted: 24
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb240c14a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.085247040s of 29.158304214s, submitted: 25
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f503c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dda000/0x0/0x4ffc00000, data 0x17d564b/0x1892000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1178729 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24bd74a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106323968 unmapped: 22970368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.328538895s of 17.374874115s, submitted: 11
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114114560 unmapped: 15179776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297901 data_alloc: 234881024 data_used: 12738560
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116293632 unmapped: 13000704 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298205 data_alloc: 234881024 data_used: 12746752
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bd4b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271ac780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb271ac000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.524518967s of 28.681346893s, submitted: 84
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c18d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24bd5860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26b31c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26ae72c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9292000/0x0/0x4ffc00000, data 0x231c65b/0x23da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323404 data_alloc: 234881024 data_used: 12775424
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f594a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352973 data_alloc: 234881024 data_used: 16846848
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116736000 unmapped: 12558336 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 12517376 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353581 data_alloc: 234881024 data_used: 16908288
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.605432510s of 17.711872101s, submitted: 26
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1406263 data_alloc: 234881024 data_used: 16941056
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 11190272 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1416385 data_alloc: 234881024 data_used: 16928768
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117432320 unmapped: 11862016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413273 data_alloc: 234881024 data_used: 16928768
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.007748604s of 12.300899506s, submitted: 81
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e41400 session 0x55fb25e2ba40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26c2a000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298825 data_alloc: 234881024 data_used: 12820480
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298993 data_alloc: 234881024 data_used: 12820480
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26c192c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb24e9b680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.708964348s of 12.890141487s, submitted: 37
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb238f32c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb24e88780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59000 session 0x55fb24bb8d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59c00 session 0x55fb24e890e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24edad20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.401422501s of 16.425735474s, submitted: 9
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109821952 unmapped: 19472384 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109920256 unmapped: 19374080 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb25e2c000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26da4b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166053 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa220000/0x0/0x4ffc00000, data 0x138f64b/0x144c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.438837051s of 13.030948639s, submitted: 230
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109223936 unmapped: 20070400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f503c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f50d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165513 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25107680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25e2ba40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182596 data_alloc: 218103808 data_used: 6209536
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bb92c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb271ac780
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fd2c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153451 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153603 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980836868s of 17.077106476s, submitted: 31
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153471 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26db9860
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24bd7c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162335 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f501e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa2ea000/0x0/0x4ffc00000, data 0x12c564b/0x1382000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f50f00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.982028008s of 10.000374794s, submitted: 6
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb243d0960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db83c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f510e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107429888 unmapped: 21864448 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107438080 unmapped: 21856256 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107454464 unmapped: 21839872 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 21823488 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.154647827s of 29.205055237s, submitted: 16
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26da4960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c2b2c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb2719dc20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: mgrc ms_handle_reset ms_handle_reset con 0x55fb26150000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: mgrc handle_mgr_configure stats_period=5
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109133824 unmapped: 23314432 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.384727478s of 18.432491302s, submitted: 9
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9a06000/0x0/0x4ffc00000, data 0x1ba866e/0x1c66000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb268d21e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112214016 unmapped: 20234240 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1283175 data_alloc: 234881024 data_used: 11640832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282131 data_alloc: 234881024 data_used: 11640832
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.296810150s of 11.368459702s, submitted: 31
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282111 data_alloc: 234881024 data_used: 11636736
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb271ad680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb268fe000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb25e2a1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163642 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.109940529s of 10.186728477s, submitted: 30
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163202 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.609736443s of 15.626793861s, submitted: 5
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268d32c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99b7000/0x0/0x4ffc00000, data 0x17e864b/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209438 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb26649e00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb268f6960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 25419776 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.847808838s of 16.891704559s, submitted: 6
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb240c2f00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 26992640 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386308 data_alloc: 234881024 data_used: 11198464
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8877000/0x0/0x4ffc00000, data 0x292864b/0x29e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d800 session 0x55fb240c10e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb271acd20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a99c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268fe1e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119521280 unmapped: 20283392 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1471204 data_alloc: 234881024 data_used: 23625728
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1472076 data_alloc: 234881024 data_used: 23629824
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 14589952 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.025381088s of 17.272668839s, submitted: 73
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f84cf000/0x0/0x4ffc00000, data 0x2ca464b/0x2d61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1526224 data_alloc: 234881024 data_used: 23797760
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129245184 unmapped: 10559488 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129335296 unmapped: 10469376 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f814b000/0x0/0x4ffc00000, data 0x305364b/0x3110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1530806 data_alloc: 234881024 data_used: 23859200
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 10346496 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.850658417s of 15.982189178s, submitted: 58
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26ae6b40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb25e2cb40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305206 data_alloc: 234881024 data_used: 11198464
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb25107c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fad20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115548160 unmapped: 24256512 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb24e894a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115556352 unmapped: 24248320 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd50e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f02000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26ae7a40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24bb85a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.909008026s of 33.959445953s, submitted: 21
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb2422b680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26ae6000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd5680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f03c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24bd4960
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222243 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a972c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240c32c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb240c21e0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24e89c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224048 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.381628036s of 18.489994049s, submitted: 27
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 20135936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.395916939s of 15.577631950s, submitted: 77
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26c2a5a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9234000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 20840448 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892c800 session 0x55fb25e2cb40
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26c192c0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb26f38000
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.909954071s of 36.011264801s, submitted: 36
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb26da45a0
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.256072998s of 21.260541916s, submitted: 1
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.078483582s of 10.083856583s, submitted: 1
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192880 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115810304 unmapped: 23994368 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.716539383s of 16.728366852s, submitted: 3
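[annotation] The "_kv_sync_thread utilization" lines report how long bluestore's RocksDB sync thread sat idle out of the elapsed interval, and how many transaction batches it submitted. A quick worked check on the line above (variable names are illustrative):

# Hedged arithmetic on the "_kv_sync_thread utilization" line above.
idle_s, elapsed_s, submitted = 16.716539383, 16.728366852, 3
busy_ms = (elapsed_s - idle_s) * 1000
print(f"{idle_s / elapsed_s:.2%} idle; {busy_ms:.1f} ms busy for {submitted} submits")

Every utilization report in this window is above 98% idle, which is consistent with the empty op histories ("op hist []") in the heartbeats: this OSD is essentially unloaded.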
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 24240128 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115302400 unmapped: 24502272 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf dump' '{prefix=perf dump}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 35241984 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf schema' '{prefix=perf schema}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
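[annotation] The do_command pairs record requests arriving on the OSD's admin socket (each command is logged once on entry and again with its result size); a config/perf/counter/log sweep like this typically comes from a metrics collector polling the daemon. The same commands can be issued by hand; a hedged sketch via the ceph CLI, assuming it is installed on the OSD host and can reach osd.0's admin socket (daemon name copied from the log):

# Hedged sketch: replay the admin-socket commands seen above via the ceph CLI.
import json, subprocess

def osd_admin(daemon, *cmd):
    """Run `ceph daemon <daemon> <cmd...>` and decode the JSON reply."""
    out = subprocess.check_output(["ceph", "daemon", daemon, *cmd])
    return json.loads(out)

diff = osd_admin("osd.0", "config", "diff")   # settings changed from defaults
perf = osd_admin("osd.0", "perf", "dump")     # same data as 'perf dump' above
print(sorted(perf)[:5])                       # peek at a few counter sections

Equivalently, `ceph --admin-daemon <path>/ceph-osd.0.asok perf dump` talks to the socket file directly; the socket path varies by deployment, particularly in containerized setups like this one.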
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 10K writes, 2997 syncs, 3.65 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1703 writes, 5619 keys, 1703 commit groups, 1.0 writes per commit group, ingest: 5.35 MB, 0.01 MB/s
Interval WAL: 1703 writes, 744 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114933760 unmapped: 35913728 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 272.946929932s of 272.950836182s, submitted: 1
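[Annotation] The _kv_sync_thread line above is a one-off utilization report rather than one of the repeating messages: the kv sync thread was idle for 272.946929932 s of a 272.950836182 s window and submitted a single transaction. A one-line check in Python:

    # Hedged check of the utilization report above.
    idle, window = 272.946929932, 272.950836182
    print(f"busy fraction: {(window - idle) / window:.6%}")   # ~0.001431%

That is roughly 0.0014% busy, consistent with the near-zero *_used counters in the _resize_shards lines.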
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 35864576 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115032064 unmapped: 35815424 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 35528704 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115326976 unmapped: 35520512 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 35504128 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 35504128 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb26c18d20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f51e00
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f51680
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb26c19c20
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115859456 unmapped: 34988032 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
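
These do_command entries are queries arriving over the OSD's local admin socket, the same interface behind `ceph daemon osd.0 ...`; something on the host is sweeping config diff/show, counter dump/schema and log dump in one pass, which looks like a diagnostics collector. A sketch of replaying one of them, assuming the ceph CLI and a local admin keyring are available on the node:

    import json, subprocess

    # 'ceph daemon <name> <cmd...>' talks to the daemon's admin socket locally;
    # 'counter dump' is one of the commands seen in the do_command lines above.
    out = subprocess.run(["ceph", "daemon", "osd.0", "counter", "dump"],
                         capture_output=True, text=True, check=True).stdout
    counters = json.loads(out)
    print(sorted(counters)[:5])   # a few of the counter groups, e.g. bluestore, osd
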
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 34996224 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec  6 05:27:31 np0005548916 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
Dec  6 05:27:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:27:31 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1485346110' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:27:31 np0005548916 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:27:31 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:27:31 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/795191216' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:27:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:32 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
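
The ganesha.nfsd epoch is its start time in hex seconds since the Unix epoch: 693402a6 decodes to 2025-12-06 10:17:10 UTC, about ten minutes before these grace-period events. Each pass re-enters a 90-second grace window with a reclaim client count of 0, i.e. no clients holding state to recover. Decoding the epoch is one line of arithmetic:

    import datetime

    epoch = int("693402a6", 16)                       # 1765016230
    print(datetime.datetime.fromtimestamp(epoch, datetime.timezone.utc))
    # 2025-12-06 10:17:10+00:00 — the ganesha.nfsd start that opened this epoch
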
Dec  6 05:27:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:27:32 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3485379357' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:27:32 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  6 05:27:32 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/568451492' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  6 05:27:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:32.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:32 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:32 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:32 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:32.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
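
The beast access lines show anonymous "HEAD / HTTP/1.0" probes landing every two seconds from 192.168.122.100 and 192.168.122.102 and returning 200, the classic shape of load-balancer health checks against radosgw rather than real S3 traffic. A probe of the same shape; the RGW address and port below are placeholders, not taken from this log:

    import http.client

    # Anonymous HEAD / against radosgw; host and port are assumptions for illustration.
    conn = http.client.HTTPConnection("192.168.122.103", 8080, timeout=2)
    conn.request("HEAD", "/")
    print(conn.getresponse().status)   # 200, matching the beast lines above
    conn.close()
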
Dec  6 05:27:33 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  6 05:27:33 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892291010' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
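
Each handle_command/audit pair records one mon_command arriving as JSON from client.admin on 192.168.122.101; the sweep that follows (mgr module ls, node ls, osd crush dump, versions, health detail, config dump, df detail) reads like a cluster-wide status collection. The same JSON interface is reachable from python-rados; a minimal sketch, assuming a local ceph.conf and admin keyring:

    import json
    import rados

    # mon_command takes the same JSON the mon logs in handle_command above.
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")   # path is an assumption
    cluster.connect()
    ret, out, errs = cluster.mon_command(
        json.dumps({"prefix": "mon stat", "format": "json-pretty"}), b"")
    print(ret, out.decode())
    cluster.shutdown()
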
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
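
_set_new_cache_sizes is the monitor-side analogue of the OSD tuner above: a roughly 973 MiB cache budget is split into what appear to be incremental-osdmap, full-osdmap and rocksdb allocations (reading inc/full/kv that way is an assumption). The splits land on exact mebibyte counts:

    MiB = 2**20
    for name, val in [("cache_size", 1020054731), ("inc_alloc", 343932928),
                      ("full_alloc", 348127232), ("kv_alloc", 318767104)]:
        print(f"{name:>10}: {val / MiB:7.1f} MiB")
    # inc/full/kv come out as exactly 328.0 / 332.0 / 304.0 MiB of a ~972.8 MiB budget
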
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2167965242' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/167369165' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1798232326' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  6 05:27:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:34.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:34 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:34 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:34 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:34.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2723227824' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  6 05:27:34 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3073635263' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1172937656' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2216123752' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3541376245' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  6 05:27:35 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1407133659' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3862076468' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  6 05:27:36 np0005548916 systemd[1]: Starting Hostname Service...
Dec  6 05:27:36 np0005548916 systemd[1]: Started Hostname Service.
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4109644866' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1301683355' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  6 05:27:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 05:27:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:36.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/475292124' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  6 05:27:36 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:36 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:36 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:36.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  6 05:27:36 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1883974811' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  6 05:27:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:37 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:37 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  6 05:27:37 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/923492754' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  6 05:27:38 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  6 05:27:38 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/897535951' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  6 05:27:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:38.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:38 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:38 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:38 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:38.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3518358618' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2997084197' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:27:39 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/458820190' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1206616175' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
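
Every audited command, whether it arrives from a network client or the local admin socket, carries the same from=/entity=/cmd= fields plus a dispatch/finished phase, so the audit channel is easy to mine mechanically. A small parser for the client-command form used throughout this log (the admin-socket form, cmd='...' args=[], would need a second pattern):

    import re

    line = ("log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892291010' "
            "entity='client.admin' cmd=[{\"prefix\": \"node ls\"}]: dispatch")

    m = re.search(r"from='([^']*)' entity='([^']*)' cmd=(\[.*\]): (\w+)", line)
    print(m.groups())
    # ("client.? 192.168.122.101:0/1892291010", "client.admin",
    #  '[{"prefix": "node ls"}]', 'dispatch')
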
Dec  6 05:27:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:40.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:40 np0005548916 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec  6 05:27:40 np0005548916 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:40 np0005548916 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:40.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:40 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:41 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:41 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:27:41 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  6 05:27:41 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2114392114' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  6 05:27:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:27:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:27:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:27:42 np0005548916 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:27:42 np0005548916 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  6 05:27:42 np0005548916 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/219266435' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
